
Topic: What is FPGA?

Catcher

What is a Field-Programmable Gate Array?
Wikipedia - information overload
Google - no links to information other than wiki

It's something about programming your own logic gates. What I don't understand are the details: how is it used? What might one do with it? Is it Arduino-compatible? How do you program one?

Thanks!

frank26080115

#1
Jul 18, 2011, 06:59 am Last Edit: Jul 18, 2011, 07:01 am by frank26080115 Reason: 1
You are not looking hard enough.

I think of it as this: do you want to build your own microchip? Too bad, that costs millions of dollars. But what if you had a microchip that could turn its insides into any microchip? Then you could make your own microchip at home. (Keep in mind that it has to be all digital circuitry, with some limitations on speed, memory, complexity, and other things.)

It's usually programmed at power-up from various sources: maybe a fixed ROM/EEPROM/flash memory, maybe from a microcontroller, or via JTAG. You basically load in a "schematic" for the innards of the chip (not really, but I don't want to get into LUTs and such yet). This schematic can be created as a graphical schematic, or "described" with an HDL (hardware description language).
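
To give a rough idea of what "describing" hardware looks like, here is a minimal VHDL sketch of a two-input AND gate. The entity and signal names are made up for illustration, and this is only one of several ways to write it:

Code:
-- Minimal VHDL sketch of a two-input AND gate (names are illustrative only)
library ieee;
use ieee.std_logic_1164.all;

entity and_gate is
    port (
        a : in  std_logic;  -- first input
        b : in  std_logic;  -- second input
        y : out std_logic   -- output follows a AND b
    );
end entity;

architecture rtl of and_gate is
begin
    y <= a and b;  -- concurrent assignment: this is "wired" logic, not sequential code
end architecture;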

liuzengqiang

What I've heard about FPGAs is that if you have a good one with enough gates, you can program it into a Pentium, provided you have the circuit diagram of the Pentium. But later you can also turn it into a video card if, again, you have the diagram of a video card. They seem to be circuit/chip blanks that can be made into anything you want within the limits of their own complexity.

retrolefty

I seem to recall reading once about one FPGA that included a ucontroller in its die, 8051-core based I think. That would combine the advantage of firmware programmability with the custom gate interface capabilities of a standard FPGA. Still not too hobby friendly; I suspect all the tool chain requirements for an FPGA would be proprietary and cost big bucks to play?

Lefty

AWOL

Quote
I seem to recall reading once about one FPGA that included a ucontroller in its die, 8051-core based I think

Xilinx do a PowerPC core for their Virtex device family.
That's serious processing power!

a.d


Still not too hobby friendly; I suspect all the tool chain requirements for an FPGA would be proprietary and cost big bucks to play?


I think the Xilinx tools are more or less free, but somewhat limited in functionality. Not sure though. Either way, it's a 4GB beast of a download.

frank26080115


I seem to recall reading once about one FPGA that included a ucontroller in its die, 8051-core based I think. That would combine the advantage of firmware programmability with the custom gate interface capabilities of a standard FPGA. Still not too hobby friendly; I suspect all the tool chain requirements for an FPGA would be proprietary and cost big bucks to play?

Lefty



Some chips have a hard microcontroller core on the die, surrounded by FPGA fabric, which means the microcontroller core itself cannot be modified but the peripheral modules can be.

Sometimes people have the HDL code to build the microcontroller out of the FPGA fabric itself (a "soft core"), which means you can modify the microcontroller core too - if the code is not encrypted (many are).

Catcher

Sounds cool! Thanks! Making my own microchip out of logic gates seems like a confusing task, though, simply because I do not understand how the logic gates in a CPU manage to carry out instructions (if/then statements in terms of logic gates  :~ ).

I will definitely read more into it. I have a book that explains the advanced mechanics of FPGAs, but it failed to explain the basics, so I was left dumbfounded :| But now I know.  :)

JChristensen


I think of it as this: do you want to build your own microchip? Too bad, that costs millions of dollars.


I once heard it described as "the first one costs millions of dollars, the rest are a nickel each". XD

keeper63


Sounds cool! Thanks! Making my own microchip out of logic gates seems like a confusing task, though, simply because I do not understand how the logic gates in a CPU manage to carry out instructions (if/then statements in terms of logic gates  :~ ).

I will definitely read more into it. I have a book that explains the advanced mechanics of FPGAs, but it failed to explain the basics, so I was left dumbfounded :| But now I know.  :)


It can be -very- confusing, based on what I have researched.

Programming is typically done using VHDL (or something like it):

http://en.wikipedia.org/wiki/VHDL

...which is really similar to C coding - and at the same time, more complex. One of the big things to wrap your head around is that processes on the chip (just like in any logic chip) happen in parallel, not in a serial fashion - so multiple "functions" defined in your code can be processing at the -exact same time-, all "in step" with the "clock" (which is typically just another signal from the outside, but I would imagine there are some FPGAs with built-in clock circuitry).
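
As a rough sketch of what that parallelism looks like (the entity and signal names here are invented for illustration), these two processes are written one after the other in the source file, but in the actual hardware they run side by side, each triggered by the same clock edge:

Code:
-- Two independent processes, both sensitive to the same clock.
-- They do not execute one after the other; they both update in parallel
-- on every rising clock edge. (All names are illustrative.)
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity parallel_demo is
    port (
        clk     : in  std_logic;
        led     : out std_logic;
        counter : out unsigned(7 downto 0)
    );
end entity;

architecture rtl of parallel_demo is
    signal blink : std_logic := '0';
    signal count : unsigned(7 downto 0) := (others => '0');
begin
    -- Process 1: toggles an LED signal every clock tick
    blinker : process(clk)
    begin
        if rising_edge(clk) then
            blink <= not blink;
        end if;
    end process;

    -- Process 2: counts clock ticks, completely independent of process 1
    ticker : process(clk)
    begin
        if rising_edge(clk) then
            count <= count + 1;
        end if;
    end process;

    led     <= blink;
    counter <= count;
end architecture;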

To wrap your head around how a CPU works (at the very basic level - there's no way to understand things like pipelining and other "modern" CPU features without knowing the basics), you have to have an understanding of basic logic circuitry. You also need to know that there is a clock (a square-wave pulse of a certain duration and frequency), and that as this clock "ticks" (in practice the clock is divided into multiple staggered "clock" lines, to time the various parts of the logic to various parts of the cycle), it increments a counter. The counter is a fairly simple bit of logic in its own right: it takes a pulse and increments (and outputs) an n-bit set of lines, high and low, representing a binary number. That counter points to an address in memory - this could be termed the "program counter". The address holds some data, called an op-code, which is just another binary number whose high/low states set some other lines in the circuitry. On the next pulse, maybe those lines say "at the next pulse, move whatever is read at this address into this memory cell" - or something like that.
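
Purely as an illustration (the entity name, address width, and port names are made up here, not taken from any real CPU), that program-counter idea looks roughly like this in VHDL:

Code:
-- Toy program counter: steps to the next address on each clock tick,
-- or loads a jump target instead. A real CPU wraps a lot more around this.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity program_counter is
    port (
        clk         : in  std_logic;
        reset       : in  std_logic;
        jump        : in  std_logic;             -- take a branch this tick?
        jump_target : in  unsigned(7 downto 0);  -- address to branch to
        addr        : out unsigned(7 downto 0)   -- address presented to memory
    );
end entity;

architecture rtl of program_counter is
    signal pc : unsigned(7 downto 0) := (others => '0');
begin
    process(clk)
    begin
        if rising_edge(clk) then
            if reset = '1' then
                pc <= (others => '0');
            elsif jump = '1' then
                pc <= jump_target;   -- "reset the program counter to an address"
            else
                pc <= pc + 1;        -- step to the next instruction
            end if;
        end if;
    end process;

    addr <= pc;   -- memory hands back the op-code stored at this address
end architecture;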

If/then clauses are (generally) managed by a register called the "flag register", which contains multiple bits that are set/reset by various portions of the CPU. Certain instructions read this register and, based on a flag, either "jump" (reset the program counter to an address pointed to by the data at that clock tick) or continue to the next address on the next clock tick. Generally the most-used flag is the "zero flag", which indicates whether a certain register (the accumulator) has reached zero; sometimes the instruction might instead check the "overflow flag", set when the accumulator wraps past its maximum value back to zero. (Note that in a simple CPU there's no such thing as negative numbers or subtraction - it's all addition - you'll have to look up "2's complement arithmetic" to see how that trick works.) There are other flags for other things - it all depends on how the CPU works.
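
Again just as a sketch with made-up names (not any particular CPU), here is roughly how an accumulator, a zero flag, and the 2's-complement "subtract by adding" trick hang together:

Code:
-- Toy accumulator with a zero flag (all names illustrative).
-- "Subtraction" is done the 2's-complement way: invert the operand and add 1.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity toy_alu is
    port (
        clk       : in  std_logic;
        subtract  : in  std_logic;           -- '0' = add, '1' = subtract
        operand   : in  unsigned(7 downto 0);
        acc       : out unsigned(7 downto 0);
        zero_flag : out std_logic            -- a conditional jump would test this
    );
end entity;

architecture rtl of toy_alu is
    signal acc_reg : unsigned(7 downto 0) := (others => '0');
begin
    process(clk)
    begin
        if rising_edge(clk) then
            if subtract = '1' then
                -- acc - operand  ==  acc + (not operand) + 1
                acc_reg <= acc_reg + (not operand) + 1;
            else
                acc_reg <= acc_reg + operand;
            end if;
        end if;
    end process;

    acc       <= acc_reg;
    zero_flag <= '1' when acc_reg = 0 else '0';  -- set whenever the accumulator is zero
end architecture;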

The above is just a real basic idea of how things work in a simple CPU. Learning how one really works (and I recommend learning some form of assembler, and possibly machine code, to really "get it" while you are reading about CPUs, etc.) can be a real treat, and it will give you great respect for (and amazement at) these devices we call "computers".

If you want some other interesting reading, look into how old player pianos work. In effect, the perforated "roll" acts as a "program", with the holes as the "instructions" - some player pianos even had limited "branching" instructions (as well as other ways to change the internal operation of the machine as it worked). Of course, you should also expand your inquiry into how certain other mechanical machines worked - the Jacquard loom, Charles Babbage's machines (as well as Herman Hollerith and the 1890 United States Census!). Remember that long before there was electronic memory of any sort, computer-like processing was done with essentially the equivalent of punched cards!

Have you ever wondered where the row/column size of IBM monitors came from (80x25)? From a punch card! The original IBM punch cards had 80 columns by 25 rows of data they could store (which of course grew out of the original Hollerith punch cards!) - and since punch cards were still being used widely (even into the 1980s and early 1990s, though by then most had been converted to tape and other storage), they needed a way to see them on video monitors, so the standard of "80x25" was picked, and it is still with us today. Much of what we know as "ASCII" has roots in old teletypes (and in earlier ticker tape machines, as well as Morse code!).

I'm cutting this off now - as you can see, I have an affinity for this kind of history; I find it utterly fascinating. I also find it tragic, in a way, that the history of these amazing machines - machines which are changing our lives in ways unimaginable even 20 years ago (and in ways we still can't imagine), machines woven into every facet of our lives (indeed, for some of us, they are woven into our bodies by skilled surgeons to keep us alive!) - is relegated to a mere footnote in our knowledge (compared to other machines with such broad impact, like the automobile or the airplane), when it is taught at all.

What's even more amazing is that these machines perform all of what you see done with computers using nothing more than small voltage level changes, NOR/NAND logic (look this up, too - did you know you can build an entire computer out of nothing but NAND gates?), and simple addition. Gah - just sitting here thinking about this; thinking about the minds of Church, Turing, and the other great mathematical minds who each built upon the other to come up with these machines (wow - and Church and Turing freed us from thinking they were only for -calculation-, which had been the paradigm of thought for over 100 years!). Heh - and what about the contributions of women! From Lady Ada (who really foresaw what computers could be used for - music composition, for instance!), to the original human "computers" (sadly, most of them unnamed), to Grace Hopper (thank her for the first compiler, and for much of COBOL! Among other things). I'm sorry - I find it hard to stop; it's a passion. All of humanity should be dancing in the street at all of this. Most people don't understand what it is that computers are, what they can be, what they may yet become.

...and what they are rapidly encroaching upon.

AWOL

Quote
The original IBM punch cards had 80 columns by 25 rows of data

80 columns, yes, 25 rows, no.
Each column was an EBCDIC character

retrolefty

I've always felt fortunate to have been the age I was when computers were still being designed and built using simple logic gates and flip-flops. The minicomputers of the 70s were almost all built from +5 VDC TTL logic chips, so when learning a new model of computer you went through the schematic drawings and could see the ALU section, the instruction decoding section, the memory interface, the I/O interface, the interrupt logic, the DMA logic, the program counter, the flag bits, the clock circuit, the reset circuit, etc. Since any single function is pretty easy to understand when it's designed with simple logic gates and registers, one got a firm understanding of how a computer can do what it does. The complexity, of course, was in the tremendous number of schematic pages it took to define the complete system.

I think today it's much harder for a newcomer to gain that same insight and knowledge, as the basic processor functions have been abstracted away to just describing what the functional elements do rather than how those functions are built. That is all now the world of IC design engineering, a rather small and select group.

Lefty

keeper63


Quote
The original IBM punch cards had 80 columns by 25 rows of data

80 columns, yes, 25 rows, no.
Each column was an EBCDIC character


You're right - my mistake... :)

JChristensen

@cr0sh, I share your interest. Interesting post!

@retrolefty, I've always said that kids these days will never understand computers on the level that I (think I) do, as they will likely never have the experience of toggling in the bootstrap loader on the front panel of a PDP-8 (the bootloader was printed as part of the labeling on the front panel).  Or substitute your favorite machine.  Add to that the experience of playing with all those TTL chips.

In fact, this is how I got involved with Arduino: I got them for the kids, hoping to give them a little more of that low-level understanding. I haven't succeeded in that yet, but the jury is still out. Unfortunately, I got bitten by the bug in the process  XD

Catcher

@cr0sh
That was a very interesting read! Although I'm a teen, I strive to understand computers and old historical devices as you do. I'm sure it will make me a better computer/electrical engineer. I'm taking your advice; I'm going to play around in assembly.  :D
Thanks
