Topic: Big Picture Question - Arduino as a sort of "computer"?

Post by: Sr. Member (Karma: 4, Posts: 329)

Okay, I'm sort of getting my mind blown here - or rather, having the veil of ignorance lifted from my mind's eye!

It's quite a general topic, and I decided that this General Electronics section would be the proper place for it, seeing as electronic engineering is the foundation (or forebear) of computer science/engineering anyway.

I caught a thread about data-logging with the Arduino, and it started me thinking about what else the Arduino can do.
I'm aware of the Ethernet shield with built-in microSD card storage (which I've not started tinkering/playing with), and then got to thinking: one really doesn't need Ethernet just to store data - so what other ways are there?
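For instance - a minimal logging sketch, assuming the standard SD library and a card whose chip-select is on pin 4 (as on the Ethernet shield's microSD slot); the file name and the once-per-second rate are just placeholders:

Code:
#include <SPI.h>
#include <SD.h>

const int chipSelect = 4;            // CS pin for the card; shield-dependent

void setup() {
  Serial.begin(9600);
  if (!SD.begin(chipSelect)) {       // bring up the card
    Serial.println("SD init failed");
    while (true) {}                  // halt: nowhere to log to
  }
}

void loop() {
  File logFile = SD.open("datalog.txt", FILE_WRITE);  // opens for append
  if (logFile) {
    logFile.println(analogRead(A0)); // one sensor reading per line
    logFile.close();                 // close to flush the data to the card
  }
  delay(1000);                       // roughly one sample per second
}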

That thread mentioned GoBetwino, which I looked up, and then realised "how deep this rabbit hole goes"!
GoBetwino is for Windows only, and figuring there must be some more generalised picture of all this, I got to:
http://playground.arduino.cc//Main/InterfacingWithHardware

Meaning the Arduino can interface not just with the inputs and outputs these starter kits deal with, but with user interface and storage devices - so we're talking "monitors" (LCD screens, etc.), "storage" (hard disks, SD cards), and keyboards (keypads).

The question then is: could one (theoretically, possibly impractically) rig up the Arduino as some kind of "mini*-computer", with a screen, keyboard and hard disk, and the Arduino as the "CPU"?
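Something like this would already be the skeleton of such a machine - a hedged sketch assuming the third-party PS2Keyboard library and an HD44780-type LCD on the stock LiquidCrystal library; every pin number here is an assumption, so rewire to taste:

Code:
#include <PS2Keyboard.h>             // third-party library (PJRC)
#include <LiquidCrystal.h>           // ships with the Arduino IDE

const int kbDataPin  = 8;            // PS/2 data line (any free pin)
const int kbClockPin = 3;            // PS/2 clock; needs an interrupt pin on an Uno

PS2Keyboard keyboard;
LiquidCrystal lcd(12, 11, 7, 6, 5, 4);  // RS, E, D4-D7

void setup() {
  keyboard.begin(kbDataPin, kbClockPin);
  lcd.begin(16, 2);                  // 16 columns x 2 rows
  lcd.print("ready>");
}

void loop() {
  if (keyboard.available()) {
    char c = keyboard.read();
    lcd.print(c);                    // echo each keystroke to the "monitor"
  }
}

Add the SD card as the "hard disk" and you'd have input, output, processing and storage - with the "operating system" being whatever sketch you burn in.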

Wouldn't that be a more comprehensive learning experience than, say, the Raspberry Pi, which sets everything up to be 'just plug and play'-easy (relatively speaking)?

I guess it depends on what level of learning about computers one wants to focus on; the Arduino environment is already making things "easy" compared to banging out '1's and '0's on bare chips...

I had some experience with BASIC programming long ago, without any electronic engineering knowledge, and am only now piecing it all together... I didn't even realise until recently that AND, NAND, OR, etc. logic gates aren't fundamental discrete components - they're built from transistors, and sold packaged several to an IC (the classic 7400 is a quad NAND)!
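A software aside: a gate is just a boolean function, and NAND alone is enough to build the rest - one reason it gets singled out. In C terms (a sketch of the idea, not any particular chip):

Code:
bool NAND(bool a, bool b) { return !(a && b); }   // the one primitive
bool NOT(bool a)          { return NAND(a, a); }  // everything else derives from it
bool AND(bool a, bool b)  { return NOT(NAND(a, b)); }
bool OR(bool a, bool b)   { return NAND(NOT(a), NOT(b)); }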

I suppose the point of this thread is just musing aloud, and maybe getting others' thoughts on this "Big Picture" - perhaps we could plot some kind of generalised (hierarchical?) map covering all the related knowledge on computers and electronics, down to physics (the actual electrons themselves!).

Something like:

COMPUTING:
Programs
Operating systems
High-level languages (C/C++)?
Low-level languages (?)
Assembly language (these would be specific to a chipset, right?)
ELECTRONICS:
Logic gates?
Components (resistors, capacitors, diodes, etc.) - whoops, transistors!
ELECTRICAL THEORY (basic relations just below):
Ohm's law
Capacitance?
Inductance?
PHYSICS:
Electromotive forces
Electron flow?
(QUANTUM PHYSICS??!)
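For the ELECTRICAL THEORY layer, those three items reduce to three defining relations (lowercase for time-varying quantities):

    V = I R          % Ohm's law
    i = C dv/dt      % capacitance: current through a capacitor
    v = L di/dt      % inductance: voltage across an inductor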

Any ideas on how to shape this properly?



* = I'm aware that the term "mini-computer" actually refers to something much bigger (!) - bigger than a micro-computer, which itself is bigger than a "personal computer" - or at least that's how I've understood "computers" through history.

Post by: Edison Member (Karma: 33, Posts: 1435)

Go for it.

Post by: Edison Member, Lacey, Washington, USA (Karma: 155, Posts: 2344)

The Commodore 64 ran on just a 1 MHz CPU, with no GUI out of the box. It was a bit like a multi-CPU system, however: besides the 6510 (a 6502 variant) in the C64 itself, each floppy drive had its own 6502. And there was GEOS, a crude (by today's standards) GUI.

Certainly something like the Raspberry Pi makes it a lot simpler, as it has things like HDMI video and an SD card reader built in already. And people -are- building Linux computers using them.

Steve Greenfield AE7HD
CET Consumer Electronics and Computer
Please don't read your attitudes into my messages

Post by: Edison Member (Karma: 33, Posts: 1435)

You should probably stick Maxwell in there, too, though quantum mechanics and relativity kind of encompass that, since magnetism is a relativistic effect of the electric force.

Post by: Faraday Member, Phoenix, Arizona USA (Karma: 40, Posts: 5570)

You're missing "machine language", which is a level lower than assembler. Also, a "micro-computer" is not necessarily "bigger" than a "personal computer" - those terms really just serve as a rough timeline of systems; relative sizes don't mean much. You also forgot the term "home computer" - a period between "micro-computers" (1970s hobbyists) and "personal computers" (mid-1980s business, mainly). There's a ton of overlap between the three genres; it wasn't until the early 1990s that the PC clones began to dominate the small-computer arena (and even then it would be a while before they dominated completely - they still had to contend with the small Unix systems of Sun, SGI, and IBM, among others).
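To make "machine language" concrete - a hedged sketch for the AVR chip on an Arduino; the encoding is what the AVR instruction-set manual gives for this instruction, though a real compiler may pick different registers:

Code:
byte x = 42;            // high-level C/C++: one assignment

// Assembly (mnemonics a human can read; chipset-specific, as noted above):
//     ldi r16, 42      ; "load immediate": put the constant 42 into register r16
//
// Machine language (the bits the CPU actually fetches and decodes):
//     1110 0010 0000 1010   =   0xE20A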

Beyond that, I'm also not sure you grasp the concept of what exactly a CPU is or does - exactly how it works (that is, at a very base level). Do you understand what microcode is, and what it does? Do you have the concept of a CPU (and memory) being analogous to a player piano (indeed - if you look back at the history of player pianos - you'll find some very interesting hardware, especially in the so-called "reproducer" machines of the early 1900s - heh)?

Have you studied the history of computation? Do you know of the contributions left by Norbert Wiener, Alan Turing, Charles Babbage, and Herman Hollerith (just to name the barest few)? James Watt? What of Jacquard? What of automata? What of Descartes and Blaise Pascal? Leonardo da Vinci? For that matter, what of the ancient Greek, Chinese, and Arabic cultures, and their contributions?

The history - the true history - of computing is as old as humanity, arguably as old as or older than the dream of flight. We have struggled for millennia toward a goal which we are only now just reaching: that of creating beings in our own image.

Computation is a part of that. Robotics is a part of that. Mathematics is a part of that. Artificial Intelligence is a part of that.

The rabbit hole goes deep - it winds through history and ages past like you wouldn't believe. The richness of the tale woven through the centuries has not yet concluded; it is one filled with majesty and triumph, as well as sorrow and defeat. Heroes and villains fill it throughout. It is a tale worth knowing, even at a superficial level, in order to understand how we have gotten to this stage.

I will not respond to Arduino help PM's from random forum users; if you have such a question, start a new topic thread.

Post by: Edison Member, Anchorage, AK (Karma: 42, Posts: 1176)

Wow man, that was poetic.  Some day I intend to create my own hobby blog site to document all the things I had to learn the hard way.  Do you mind if I (eventually) quote your post there?

Post by: Shannon Member (Global Moderator), Netherlands (Karma: 216, Posts: 13666)

Cr0sh +1

Rob Tillaart

Nederlandse sectie - http://arduino.cc/forum/index.php/board,77.0.html -
(Please do not PM for private consultancy)

Post by: Edison Member (Karma: 33, Posts: 1435)

And just to not be sexist, throw in Lady Ada Lovelace and Adm. Grace Hopper.

Post by: Tesla Member, SF Bay Area (USA) (Karma: 132, Posts: 6746)

Quote
perhaps we could plot some kind of generalised (hierarchical?) map covering all the related knowledge on Computers
I doubt it.  I mean, you went 'down' from 'programs', but there is also 'up': algorithms, math (in all its assorted varieties), linguistics, robotics, artificial intelligence, philosophy, provability, game theory, numerical analysis, cryptography, systems management, information technology, databases, networking...  Any one of which can consume a full career...  (And I wouldn't say that they're all hierarchical, either; things branch off in random directions.)

Post by: Sr. Member (Karma: 4, Posts: 329)

Wow, thanks for all the comments, guys!
It is indeed a BIG picture!
Perhaps I'm trying to bite off a bit too much.
Getting the timeline into it is probably essential for fitting everything (or most of it) together.

Thanks again - it helps to fill in the gaps I'm unaware of, and helps me arrange the "pieces of the puzzle" a little better.
Hopefully I can post something that looks more presentable in the (near?) future.

The 'up' is probably what concerns most people (in computer science) now - the abstraction of actual program code into concepts, i.e. the algorithms. This is where I stumble, having learned BASIC without ever hearing of state machines.
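(Sketching what a state machine even looks like on an Arduino - a minimal example assuming only the stock core: the blinker below has two named states and time-based transitions instead of delay() calls.)

Code:
enum State { LED_OFF, LED_ON };    // the machine's two states
State state = LED_OFF;             // where the machine currently is
unsigned long lastChange = 0;      // millis() at the last transition

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  switch (state) {                 // behaviour depends only on the current state
    case LED_OFF:
      if (now - lastChange >= 900) {      // after 900 ms off...
        digitalWrite(LED_BUILTIN, HIGH);
        state = LED_ON;                   // ...transition to ON
        lastChange = now;
      }
      break;
    case LED_ON:
      if (now - lastChange >= 100) {      // after 100 ms on...
        digitalWrite(LED_BUILTIN, LOW);
        state = LED_OFF;                  // ...transition back to OFF
        lastChange = now;
      }
      break;
  }
}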

Post by: Newbie (Karma: 0, Posts: 37)

I KNOW EXACTLY WHAT YOU'RE SAYING!! I am in the same boat as you: I see these people making all sorts of tools and gizmos and think to myself, "you can actually make anything with this".

I have seen Arduino-based:
servers
drones
signal amp towers
pong
Arcade machines
garage door openers
interactive and automated gardens
remote flame throwers
security devices
tasers
3d printers
laser guided rovers
pvc submarines
microwave ovens
geiger counters
smart phones
blimps
lawn mowers
and so much more

And this is just off the Arduino board - not counting other development boards like the Raspberry Pi, BeagleBone, etc.

Now, what I was thinking is to learn from it by starting a project that uses all of this knowledge to make a completely networked group of machines, running off data given by the user and data retrieved using sensors and tools. Reading that, you think "that's too complex", but in theory it really is not. Arduino makes it possible to do almost anything - so why not be the one who does it? Maybe even start a company off of it and make a small profit, so you can achieve a broader goal.

-Founder of https://turttech.com

Post by: God Member (Karma: 2, Posts: 713)

reinvent the wheel much?

Post by: Tesla Member, SF Bay Area (USA) (Karma: 132, Posts: 6746)

It might be worthwhile pointing out that the Arduino, with its 28-pin microcontroller, DOES have capabilities similar to or exceeding early "general purpose" personal computers.  The original Apple ][, for instance, apparently shipped with 4K of RAM and 8K of ROM, running on a 1 MHz CPU.  Personally, I used to want a COSMAC Elf, which defaulted to a whopping 256 bytes (or up to 2K/8K) of RAM.
The Timex/Sinclair 1000 (the ZX81 derivative; the ZX80 had even less) had 2K of RAM and 8K of ROM, and ran much slower than a 16 MHz AVR CPU.

Post by: Sr. Member (Karma: 4, Posts: 329)

Quote
reinvent the wheel much?
Not so much reinventing as recording how it was invented.
If you already have a clear view of such a Big Picture, good for you.

Post by: God Member (Karma: 2, Posts: 713)

Yeah, that makes a lot of sense - working with Arduinos has helped me understand this process as well. :-)