Sounds cool! Thanks! Making my own microchip out of logic gates seems to be a confusing task though. Simply because I do not understand how the logic gates in a CPU manage to execute instructions. (if and then statements in terms of logic gates :~ )
I will definitely read more into it. I have a book that explains the advanced mechanics of FPGAs but it failed to explain the basics, so I was left dumbfounded :| But now I know. :)
It can be -very- confusing, based on what I have researched.
Programming is typically done using VHDL (or something like it):
...which reads a lot like a conventional programming language - though syntactically VHDL actually owes more to Ada than to C (Verilog is the C-like one) - and at the same time, it's more complex. One of the big things to wrap your head around is that processes on the chip (just like in any logic circuit) happen in parallel - not in a serial fashion - so multiple "processes" defined in your code can be running at the -exact same time- and "in step" with the "clock". The clock is often just another signal from the outside, though most FPGAs do include built-in clock-management circuitry (PLLs and the like) to multiply, divide, and clean up that external clock.
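To make that "everything updates in step with the clock" idea concrete, here's a sketch in Python rather than VHDL (a toy simulation is easier to poke at than real HDL; the process names and signals are all invented for illustration). The key trick: every process reads the *same* snapshot of the current signal values, and all updates take effect together at the "clock edge".

```python
# Toy model of HDL-style clocked concurrency: on each tick, every process
# computes its next outputs from a snapshot of the current state, and all
# the updates commit at once - none of them sees the others' new values.

def tick(state, processes):
    snapshot = dict(state)              # everyone sees the same "now"
    updates = {}
    for proc in processes:
        updates.update(proc(snapshot))  # next values, not yet visible
    state.update(updates)               # the "clock edge": commit everything
    return state

# Two hypothetical processes: a counter, and a register shadowing it.
def counter(s): return {"count": s["count"] + 1}
def shadow(s):  return {"copy": s["count"]}   # sees the pre-tick count

state = {"count": 0, "copy": 0}
for _ in range(3):
    tick(state, [counter, shadow])
# "copy" always lags "count" by one cycle - the same one-cycle delay you
# get from clocked registers in real hardware.
```

Note that it wouldn't matter if `shadow` ran before `counter` in the list - that's the parallel, order-independent behavior that trips up people coming from C.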
To wrap your head around how a CPU works (at the very basic level - there's no way to understand things like pipelining and other "modern" CPU features without knowing the basics), you first need an understanding of basic logic circuitry. You also need to know that there is a clock: a square-wave pulse of a certain duration and frequency (in practice it's divided into multiple staggered "clock" lines, to time the various parts of the logic to various parts of the cycle). As this clock "ticks", it increments a counter - a fairly simple bit of logic in its own right, which takes a pulse and drives an n-bit set of output lines high and low to represent a binary number. That counter points to an address in memory; this is the "program counter". The data at that address is called an op-code - just another binary number of high/low states - and its bits set other lines in the circuitry. On the next pulse, maybe those lines say "move whatever is read at this address into that memory cell" - or something like that.
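That fetch-increment-execute cycle is easier to see in a toy simulation than in gates. Here's a sketch in Python of a completely made-up 3-instruction machine (the op-codes, the accumulator, everything here is invented for illustration - no real CPU looks like this):

```python
# Toy fetch-execute loop: the program counter (pc) indexes memory, the
# op-code found there selects what happens on this "clock tick".

LOAD, ADD, HALT = 0, 1, 2            # made-up op-codes

def run(memory):
    pc, acc = 0, 0                   # program counter and accumulator
    while True:
        op, arg = memory[pc]         # fetch: read the op-code at the PC
        pc += 1                      # the counter increments each cycle
        if op == LOAD:               # decode + execute
            acc = arg
        elif op == ADD:
            acc += arg
        elif op == HALT:
            return acc

program = [(LOAD, 5), (ADD, 7), (HALT, 0)]
```

Calling `run(program)` loads 5, adds 7, and halts with 12 in the accumulator. In hardware, of course, there is no `while` loop - the "loop" is just the clock ticking and the counter counting.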
If/then clauses are managed (generally) by a register called the "flag register", which contains multiple bits that are set and reset by various portions of the CPU. Certain instructions read this register and, based on a flag, either "jump" (reload the program counter with an address supplied by the instruction) or simply continue to the next address on the next clock tick. The most commonly used is the "zero flag", which indicates whether a certain register (the accumulator) has reached zero; sometimes the instruction might instead check the "overflow flag", set when the accumulator wraps past its maximum value back to zero. (Note that deep down in a CPU there's no real subtraction - it's all addition; you'll have to look up "two's complement arithmetic" to see how that trick works.) There are other flags for other things - it all depends on how the CPU works.
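Both tricks - the flag-driven jump and subtraction-by-addition - can be sketched the same toy-machine way (again in Python, with invented op-codes; this is an illustration of the mechanism, not any real instruction set):

```python
# "if/then" at machine level: DEC sets a zero flag when the accumulator
# hits zero; JNZ (jump-if-not-zero) reloads the program counter only while
# the flag is clear. That pair of instructions *is* a while-loop.

LOADI, DEC, JNZ, HALT = range(4)     # invented op-codes

def run_loop(memory):
    pc, acc, zero_flag = 0, 0, False
    decs = 0                         # count how many times DEC executes
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == LOADI:
            acc = arg
        elif op == DEC:
            acc -= 1
            decs += 1
            zero_flag = (acc == 0)   # the ALU sets the flag as a side effect
        elif op == JNZ:
            if not zero_flag:
                pc = arg             # "then": jump back and go around again
        elif op == HALT:
            return decs

program = [(LOADI, 3), (DEC, 0), (JNZ, 1), (HALT, 0)]
# run_loop(program) executes DEC exactly 3 times before falling through.

# And the two's complement trick: 8-bit subtraction done purely by adding.
def sub8(x, y):
    return (x + ((~y + 1) & 0xFF)) & 0xFF   # x + (two's complement of y)
```

`sub8(5, 3)` gives 2; `sub8(3, 5)` gives 254, which is exactly the 8-bit two's-complement pattern for -2 - same adder circuit, no subtractor needed.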
The above is just a real basic idea of how things work in a simple CPU. Learning how one really works (and I recommend learning some form of assembly language, and possibly raw machine code, to really "get it" while you are reading about CPUs, etc.) can be a real treat, and will give you great respect for (and amazement at) these devices we call "computers".
If you want some other interesting reading - look into how old player pianos work. In effect, the perforated "roll" acts as a "program" with the holes as the "instructions" - some player pianos even had limited "branching" instructions (as well as other things to change the internal operation of the machine as it worked). Of course, you should also expand your inquiry into how certain other mechanical machines worked: the Jacquard loom, Charles Babbage's machines, and Herman Hollerith's tabulators (and the 1890 United States Census!). Remember that long before there was electronic memory of any sort, computer-like processing was done with essentially the equivalent of punched cards!
Have you ever wondered where the column width of IBM text displays (80x25) came from? From a punch card! The standard IBM punch card held 80 columns of data, with 12 punch rows per column (a format which of course grew out of the original Hollerith cards!) - and since punch cards were still being used widely (even into the 1980s and early 1990s - though by then most data had been migrated to tape and other storage), terminals needed to display a full card line on screen, so 80 columns became the standard width and is still with us today (the 25 rows came later, from video hardware constraints rather than the card). Much of what we know as "ASCII" likewise has roots in old teletypes (and earlier ticker-tape machines, as well as Morse code!).
I'm cutting this off now - as you can see, I have an affinity for this kind of history; I find it utterly fascinating. I also find it tragic, in a way, that the history of these amazing machines - machines which are changing our lives in ways unimaginable even 20 years ago (and in ways which we still can't imagine); machines which are woven into every facet of our lives (indeed, for some of us, they are woven into our bodies by skilled surgeons to keep us alive!) - is relegated to a mere footnote in our knowledge, when it is taught at all, compared with other machines of such broad impact, like the automobile or the airplane.
What's even more amazing is that these machines perform all of what you see done with computers using nothing more than small voltage-level changes, NOR/NAND logic (look this up, too - did you know NAND is "functionally complete", meaning any logic circuit, and so ultimately a whole computer, can be built from NAND gates alone?), and simple addition. Gah - just sitting here thinking about this; thinking about the minds of Church, Turing, and the other great mathematical minds who each built upon the others to come up with these machines (wow - and Church and Turing freed us from thinking they were only for -calculation-, which was the paradigm of thought for over 100 years!). Heh - and what about the contributions of women! From Ada Lovelace (who really foresaw what computers could be used for - music composition, for instance!), to the original human "computers" (sadly, most of them unnamed), to Grace Hopper (thank her for the first compiler, and for FLOW-MATIC, the direct ancestor of COBOL! Among other things). I'm sorry - I find it hard to stop; it's a passion. All of humanity should be dancing in the street at all of this. Most people don't understand what it is that computers are, what they can be, what they may yet become.
...and what they are rapidly encroaching upon.
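P.S. - that NAND claim is easy to check for yourself. Here's a quick sketch in Python (simulating the gates with booleans, since I can't hand you a breadboard through the screen) building NOT, AND, OR, and XOR out of nothing but NAND:

```python
# NAND is functionally complete: every other gate can be wired from NANDs.
def nand(a, b): return not (a and b)

def not_(a):    return nand(a, a)            # 1 NAND
def and_(a, b): return not_(nand(a, b))      # 2 NANDs
def or_(a, b):  return nand(not_(a), not_(b))  # 3 NANDs
def xor_(a, b):                              # the classic 4-NAND XOR
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Truth-table check against Python's own boolean operators:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b)  == (a or b)
        assert xor_(a, b) == (a != b)
```

Add a way to store a bit (two NANDs cross-coupled make a latch) and you have everything a CPU is made of.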