How did you learn to program (Any Language)?

Hi everyone,

I'm starting to get used to Arduino programming, and I have moved on to C++, JavaScript, and HTML and CSS. I've only explored the tip of the iceberg for all of these languages, except for Arduino. When I googled "how to program" (only because I was bored and genuinely curious about how people learn), the results showed many tutorials about practicing by reading books or online interactive tutorials, and practicing by typing the code over and over until it sticks in your mind.

For me, I read books and websites about the language, got a piece of paper, and wrote down notes on what each example does; then I broke the example down and practiced typing it over and over. It was rare to find a tutorial that mentioned taking notes. My guess is that note-taking is unnecessary in programming, because knowing the example, how to type it, and experiencing its outcome is what matters more.

I'm just a beginner as of now, but out of curiosity: beginners and experienced programmers alike, how did you learn to program?

I learned BASIC in High School in about 1980. The course was called Computer Math. O. W. Holmes High School in San Antonio.

There were books involved, but mostly we learned from the teacher. We used a Sharpie marker or #2 pencil and filled out the lines of code on mark-sense cards in Hollerith code, the same patterns used by punched cards. The machine we had could actually read punched cards, but we didn't have any card punch machines; it was a fancy reader that could also read dark black marks. We prepared the cards by hand in class, stacked them, then took turns feeding the cards in and executing the programs. Some of us stayed after school for extra time on the computer, and then we had direct teletype access. The terminal was a wide-format dot matrix printer with an integrated keyboard and a 110 baud modem. No screen. We would dial the main computer with a touch-tone phone (fancy), and when we heard the receiving modem squeal, push the handset into a couple of suction cups, and that would establish a session. Then we would log in and interface directly using the teletype, or feed our cards in. Fun stuff.

The first computer I owned was shortly after that, a Commodore VIC-20 with a whopping 5K of RAM and a 6502 processor. I added RAM and EPROM to it with an external perf board and wire-wrap sockets, wire-wrapped one hell of a rat's nest of wires, and it worked great.

I had a UV tube in a cardboard box that I would set the EPROMs in, under the light, to erase them; then I would program them with another wire-wrapped board and a BASIC program. Then I could move the EPROM into its home on the main RAM/EPROM board and use the programs. The next language after BASIC was 6502 machine language (the assembler was me, the data sheet, and the language reference manual). Then I obtained an assembler program and could write a lot faster, especially after I improved the assembler code beyond the basics. The learning back in those days came from books purchased at the bookstore and from monthly magazines that I bought at a news stand. And data sheets (paper, not PDF). There was no Internet.

Then on to a System/370 mainframe in college, with a variety of languages including COBOL, FORTRAN, RPG II, Assembler...

Back in the late 70s, we copied BASIC programs out of magazines and then tweaked them on a teletype terminal over a 300 baud phone modem.
Then in college, I had actual classes in programming to understand what we had been doing, and to incorporate more complex concepts.

Started with BASIC on a Microbee (Australian, 16KB RAM, cassette storage). Then went to night school.

Not used for professional purposes, purely hobby.

Now using my hobby for small projects at work. They need to be finished in less than three months, when I retire.

Weedpharma

I started out with Java, but disliked the syntax, so I switched to C++. My grandfather is an experienced programmer (he worked for IBM for many years), and I got some help from him when starting out. More recently I started working with microcontrollers. I still consider myself inexperienced with hardware and have much room for improvement.

I started out learning BASIC on my first computer, a TRS-80 Color Computer 2 with 16K of RAM. The processor (a 6809) ran at sub-MHz speed (895 kHz, IIRC?); a television was my "monitor", and for storage I had a cassette tape (later upgraded to a floppy drive).

This was about 1984 or so; I was 10 years old.

Initially, I used the books that came with the computer to learn with. My parents also got me a few subscriptions to various computer magazines (Rainbow, Hot CoCo, Family Computing, K-Power) that I typed in programs from. There were also various books from the library that I would check out, and make various attempts to convert code over (the Color Computer was very under-represented in the world - so learning how to convert from one BASIC to another was of paramount importance). Sometimes I would be successful, sometimes not - in many cases I would have to do a lot of debugging to get things to work properly.

I learned more from my mistakes than my successes.

One thing that you might find useful is to stick to a single language, and not jump around between languages in your learning. Pick a language, and stick with it. I would personally recommend the Arduino and C/C++ (I wish I could have had this when I was learning - but at the time, a C compiler for the CoCo required a different operating system - Microware's OS-9 - and both together would set you back a few hundred dollars, not to mention that you needed a dual floppy drive system plus 64K of RAM - in short, not on my parents' budget).

The reason I say C/C++ is that most other languages - once you learn the ins and outs of C/C++ - will come pretty naturally to you (especially if you go beyond the simplified microcontroller system and move more into programming larger applications on a PC).

Anyhow - ultimately if you want to become good at it, expect a long road ahead - one filled with fun times and lots of frustrations too. You'll never learn everything - and you ultimately don't want to stick with only a single language your entire "career" (though if you did, C/C++ is not a bad one in general - you'll always be employed in some regard).

Most importantly - become familiar with the basic structures which almost all programming languages share; in other words, don't focus on the "language-du-jour" - but instead focus on the commonalities; if you do this, in short order you'll be able to pick up and work with any language thrown at you (for most languages - there are some out there that defy explanation and have a cliff-like learning curve - but most of those don't have much following or employment opportunities, if that is your goal - so not much loss).
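To make those commonalities concrete, here is a purely illustrative C/C++ sketch (the function name and the numbers are mine, not from any particular tutorial) showing the handful of structures nearly every imperative language shares: variables and assignment, iteration, selection, and functions. Learn these once and, for most languages, only the syntax changes.

```c
#include <stdio.h>

/* A function - almost every language has some form of these. */
int sum_of_evens(int limit) {
    int total = 0;                       /* a variable and an assignment */
    for (int i = 0; i <= limit; i++) {   /* iteration (a loop) */
        if (i % 2 == 0) {                /* selection (a branch) */
            total += i;
        }
    }
    return total;
}

int main(void) {
    printf("Sum of evens up to 10: %d\n", sum_of_evens(10)); /* prints 30 */
    return 0;
}
```

The same four ideas map almost one-for-one onto Java, Python, JavaScript, and nearly everything else you'll meet.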

To this end - pick up a copy of "The C Programming Language" - read it for the understanding, not so much to learn from (as it is very out of date in that regard). Also, you might want to try your hand at learning and understanding at least one assembler language (if you are learning with an Arduino, then AVR assembler might be the ticket) - doing so will teach you a ton about what is really going on under the hood.
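If you want a small taste of "under the hood" before committing to assembler, here is a minimal sketch - assuming an Uno-class board with an ATmega328P, where the pin-13 LED sits on port bit PB5 - that blinks the LED by writing the AVR I/O registers directly instead of calling pinMode() and digitalWrite(). Each register operation here compiles down to just an instruction or two.

```cpp
// Blink the built-in LED by touching the ATmega328P registers directly.
// DDRB, PORTB, DDB5, and PORTB5 come from <avr/io.h>, which the Arduino
// core already includes. Assumes an Uno-class board (LED on PB5 = pin 13).

void setup() {
  DDRB |= (1 << DDB5);       // make PB5 an output (what pinMode(13, OUTPUT) does)
}

void loop() {
  PORTB |= (1 << PORTB5);    // drive PB5 high (digitalWrite(13, HIGH))
  delay(500);
  PORTB &= ~(1 << PORTB5);   // drive PB5 low (digitalWrite(13, LOW))
  delay(500);
}
```

Stepping through the disassembly of something this small is a gentle on-ramp to AVR assembler.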

Finally - if you have a mechanical bent - look into how player pianos work (particularly so-called "reproducing player pianos") - as well as Jacquard looms, and perhaps even Charles Babbage's works (oh - and Herman Hollerith's tabulators) - if you do this after gaining an understanding of a CPU, registers, the stack, the heap, etc. - how it all works, and more - you'll come to a small and interesting "epiphany" that might help it all fit together.

Honestly - I could go on forever about this subject - you have no idea; but I do encourage you to study broad, and study deep - computation and its history, along with all the side stuff - because it is truly one of the most fascinating subjects humanity has wrought over the centuries...

I was dropped into the deep end, figured out how to dog paddle, then decided I liked it.

My formal introduction to computer programming:

When I was a sophomore, way back in 1971, I took general chemistry for majors. Out of the blue, the instructor assigned an odd homework: write a FORTRAN program to calculate the electron density between any two atoms in a small molecule. We were given 2 weeks and no training in the language, or even physically how to write and run a program (they didn't have PCs - it had to be run on an IBM mainframe.)

Word got around that the way to do it was to find a student who had done the same assignment the previous year, get a copy of their code, figure out what all those punch cards meant, modify it for my molecule, and take the deck (of nearly 1000 cards) to the computer center and submit it for overnight batch processing.

Now I find the few seconds for an Arduino program to load and run excruciating!

Study. Implement. Practice...

One thing you should expect to do is work outside your comfort zone. If you were to major in Computer Science or Computer Engineering at a university, you might have a one-semester course "Intro to C++", which might make you about as good/bad a C++ programmer as someone who had "mastered" some Arduino stuff.
But then, you'd take a semester of "Numerical Methods", one of "Data Structures and Algorithms", one on "Microprocessors", one on "Assembly Language", one on "Compiler Design", one on "Computer Graphics", one on "User Interfaces", one on "Operating Systems", one on "Computer Architectures and Digital Logic", and maybe one on "Computer Networking and Communications". Not to mention some physics, chemistry, philosophy (logic and Boolean algebra, at least), and several semesters of assorted higher maths. You could expect at least one of those classes to use, say, Java instead of C++, even if they assured you that C++ was THE language to learn.

Do well, and you'll get an entry-level job, and learn about real-world issues like version control, bug tracking, compiler pricing, license issues, resource contention, real-world libraries, varying code styles, multiple-OS compatibility issues, and security.

Now, you can find a lot of info on a lot of these online - sometimes as raw info, sometimes as tutorials, sometimes as video recordings of live classes, sometimes as full-fledged MOOCs from prestigious universities with homework, projects, grading, and a "Certificate of Completion". But you have to be willing to, for example, take the Princeton classes in algorithms (which are excellent!) even though they use Java and you'd really rather use C. Being a "good programmer" doesn't mean "I've used one language and one platform so much that I'm really good at it" so much as it means "I've learned a lot of basics that I can probably apply anywhere (perhaps with some delay), AND I've used at least this one platform/language enough to be immediately useful."

Also, don't forget to learn about the "problem spaces." No one wants a computer program just to have a computer program - the program is supposed to solve some sort of problem. A "good" programmer should understand the PROBLEM as well as how to program. For microcontrollers, that means some electronics, maybe some mechanics and optics.

I did a year of Computer Science at university, writing simple assembler programs to run on an IBM 360. It was the only thing I found interesting, and I failed most everything else, so that was the end of my university education.

The rest of my programming knowledge is self-taught - Fortran, Basic, Spreadsheets, PIC assembler, Ruby, C/C++ for the Arduino, and Python. Ruby is probably my favourite, but Python is much more widely used.

Much of the early self learning was with the help of Personal Computer World magazine.

To be honest I think I just "understood" programming from day one. (Which is not to claim that I am good at it)

...R

( How DID I learn...)
So, my high school taught BASIC (1976) on a timesharing system. The system had other fun things (games, EMAIL!!!) I thought I wanted to write an EMAIL program, for which BASIC was "not adequate", so I taught myself some FORTRAN and Assembler (there was a tutorial!), as well as looking at Algol and APL. And I read a bunch about microprocessors, most of which were programmed in assembly, hand translated to binary, in those days. Or maybe BASIC if you shelled out for a really expensive personal computer. I obsessively read Byte Magazine, Popular Electronics, and similar, trying to pick out a microcomputer that I could afford and would let me do things that I wanted to do. Meanwhile, the Mainframe was there...

In college, I was an EE major. I thought I wanted to build computers, so I had an "above average" number of courses from the CS department. I took the EE intro Fortran class and the CS intro PL/1 class, then most of the CS classes I listed in my previous note, plus the EE requirements.

I got an on-campus job as a "computer operator" for the business school mainframe (bursting printouts, mounting tapes, and helping users - and, incidentally, all the mainframe time I could use, plus ARPANet access). I had summer jobs (essentially an internship?) programming in Fortran and/or PL/1 for an oil company. I added ARPANet email lists to my reading (info-micro, info-cpm, human-nets, etc.). I wrote code far beyond what was required for coursework, eventually becoming "Jr Systems Programmer", fixing code and writing utilities that other people used.

I got my first job (mainframe systems/networking programmer) by email, essentially. They bought an early IBM-PC as an employee "toy", and since I had done some 8086 programming in school, I wrote code for that too (a nice comm program for talking to the mainframes - it worked so well that it spurred a somewhat ill-fated commercial venture, which taught me a lot about legal issues surrounding SW development :frowning: ). I did some outside consulting and got paid off with a (my first!) personal computer (a non-PC-compatible MSDOS system). (In retrospect, that project was a first tentative and primitive step toward a WWW-like tool.)
I switched jobs and went to work for Stanford (still as a mainframe systems programmer). I took vendor-provided training (TOPS-20 operating system internals). You can learn a lot just hanging around at a place like Stanford. I partially audited classes in Cray assembly language and Smalltalk. I improved the TCP network code in the mainframe by a factor of 10 or so. I learned more Unix. I tried to update the microcode of the 3 Mbps Ethernet interface to provide a continuous carrier so that it would work over a microwave link (I made "progress", but didn't get too far). I took a consulting job and updated the MacLisp network-based file archiving system for the latest OS version.

Did I mention that somewhere in there, my favorite mainframe line got discontinued? When most of the people I was working with left to form a startup, and (not too long thereafter) invited me to join them, I jumped at the chance.
That was cisco Systems. I worked there for 22 years, mostly being the "terminal server and dialup expert", but also working on things like the early low-end routers, new platform ports of the OS, X.25, ISDN (more vendor training!), Novell and XNS, IP Options, and "x86 stuff." I was even a manager for a couple years.

Then I retired, finally leaving me free to re-examine the small microcontrollers and microcomputers that I had always been interested in, but never really had time for. When the Uno came out, Optiboot had bugs and the author had disappeared, so I took that over, and now I'm considered a bootloader expert! And MOOCs. I've been taking MOOCs, rather informally... Reading way too many forums and mailing lists. Trying new things, learning new stuff all the time. It's great!

I have yet to learn an assembly language (still debating which one/ones). It is a pity that Atmel is being bought by Microchip Technology. One fewer option for the hobbyists.

goodinventor:
I have yet to learn an assembly language (still debating which one/ones). It is a pity that Atmel is being bought by Microchip Technology. One fewer option for the hobbyists.

It doesn't necessarily follow that the AVR processors would be discontinued.

Well, Microchip Technology already has its own popular processor architecture, PIC, which covers the same range of applications. It sure looks like they bought Atmel primarily to remove competition, not because they thought Atmel had better products. Of course, I could be completely wrong.

I started by learning BASIC and a pseudo-assembly language called CESIL at school in 1975.
By batch-processing, with a two to three day turnaround.

I still have the flashbacks and wake up with the cold-sweats, but the counselling and drugs are easing the symptoms.

Microchip has also acquired other companies and NOT stopped "same-space" product sales (Notably the 8051 chips acquired via the SST merger in 2010.) They've as much as said that they have no plans to discontinue the AVR (of course, that's what they WOULD say, at this point.) (There are whole separate threads on the Atmel/Microchip merger.)
(now, will both the SST and Atmel 8051 chips survive at Microchip? Good question. In general, Microchip seems happy to sell as many "old" chips as anyone is willing to buy. You can still get PIC16F84s!)

I have yet to learn an assembly language (still debating which one/ones).

Better hurry. The educational and documentation infrastructure is disappearing. :frowning:
(For example, the SAMD datasheets (Arduino Zero) don't even contain an "instruction set summary")

westfw:
Better hurry.

Or just buy a few Atmega 328s and put them away for safe keeping :slight_smile:

...R

Robin2:
Or just buy a few Atmega 328s and put them away for safe keeping :slight_smile:

...R

A few hundred ...

ChrisTenone:
A few hundred ...

IIRC you would prefer people to have unused PICs in their cupboard ?

...R

Better hurry. The educational and documentation infrastructure is disappearing. :frowning:
(For example, the SAMD datasheets (Arduino Zero) don't even contain an "instruction set summary")

I have access to the ARM Architecture Reference Manuals; the instruction sets are summarized in those documents along with many other architectural features. I was thinking I should learn x86/x64 assembly (for PC low-level tasks), some form of ARM Cortex-A series assembly (ARMv7-A or ARMv8-A) for my visual processing tasks, some form of ARM Cortex-M series assembly (ARMv7-M or ARMv8-M), and of course the assembly language for a basic 8-bit microcontroller for the basic tasks (I'm leaning towards AVR, but then again, I'm not sure it will still exist by the time I have learned the architecture).

Robin2:
IIRC you would prefer people to have unused PICs in their cupboard ?

...R

Or as we say, "PICs - your home town microcontroller". Of course, the brand new (and huge) Intel plant is right across the street, so of course I like those too! It's like silicon desert here in Chandler. In Scottsdale, it's more like silicone desert, but that's something different.