Best programming language for beginners?

BulldogLowell:
Computer programmers like to stand behind the scary, fiery wall like the Great Wizard of Oz. Warding off folks, lest they discover what's behind all that fire and smoke.... just some dude pulling levers.

There's a priesthood with jargon and all, and boy oh boy they HATE when outsiders fix their code and beat them to contracts.

The problem with BASIC is that it has little standardization. A modern BASIC (say, Microsoft Visual Basic or RealBasic) has most of the features that a modern programmer should like, a compiler that produces fast code, and a good IDE with powerful debugging features. But those tend to lack "embedded" features. Other BASIC compilers are aimed specifically at microcontrollers, but you're likely to think that you've been handed a completely different language than those "Desktop BASICs" (you have). Parallax Stamp PBASIC is horribly primitive and slow, for instance (but they make up for it by including "embedded programming functions" as language keywords, like "shiftout").
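To give a flavour of what a keyword like "shiftout" buys you, here's a rough sketch of the idea in Python (my own illustration, not PBASIC; set_pin() is a made-up stand-in for real pin I/O): clock one byte out of a data pin, most significant bit first.

def set_pin(pin, level):
    print("pin %d -> %d" % (pin, level))   # stand-in for real hardware I/O

def shiftout(data_pin, clock_pin, value, bits=8):
    for i in range(bits - 1, -1, -1):      # most significant bit first
        set_pin(data_pin, (value >> i) & 1)
        set_pin(clock_pin, 1)              # pulse the clock to latch the bit
        set_pin(clock_pin, 0)

shiftout(data_pin=0, clock_pin=1, value=0xA5)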

Pascal was indeed designed as a teaching language. But that was back in the mid-1970s, and it's been a bit neglected since the 1980s. To be useful, it needs "extensions" beyond the basic language, which are somewhat poorly standardized. I became disenchanted with Pascal when I realized how much it cheats - the language has features and syntax that a user can't duplicate (at least, not without more 'extensions'). The classic example is writeln, which takes a variable number of arguments and special formatting syntax that no user-written procedure is allowed to have. And then there's the extent to which it seemed to deliberately limit itself to be ONLY a "teaching language" (not that that lasted; for a long time, a lot of the Apple system and application software was written in a Pascal-like language).

When considering a "learning language", you need to consider the teaching and learning resources available, and your goal. A lot of "learn XXX in 5 weeks"-style programming instruction (including Arduino) is about learning to do some useful things as quickly as possible. That's great, but you miss out on principles. A lot of university introductory Computer Science classes are big on principles and theory, but essentially only prepare you to take the next class, where you'll learn more details (and eventually, how to do something more useful than a class assignment). And the truth is that even after getting through a 4-year CS degree, your actual programming skill may be "meh" by industry standards, especially if you haven't forced yourself to do some major projects outside the scope of the usual assignments. Universities also teach different languages depending on whether you're going into theory (CS) or into actual problem solving (EE, ME, most of the sciences, etc.). (At one of the colleges my daughter looked at, they were still teaching Fortran! To Physics or ME majors, IIRC.)

So... universities today seem to be teaching Java, Python, and C++ as their "intro" languages. I've taken a bunch of online classes (MOOCs). There have been some very good classes using Java and Python. The only C++ class I took was... pretty awful. There was also the UTexas "Embedded Systems" class that used C (and taught some embedded C constructs, but expected you to already know the basics).

There's also the fact that "beginning" classes are NOT going to teach you everything there is to know about a language. One of the complaints about all three of those languages is that they're HUGE, with MANY FEATURES and EVEN MORE LIBRARIES (as opposed to, say, C, which is really tiny, but still has lots of libraries). Some of the features seem to be obscure nods to some tiny corner of some unknown discipline, rarely used unless YOUR PROFESSOR happened to like them (or someone where you work). The way you learn about these generally involves coming across them in published code, going "WTF?", and then figuring them out. Then you can either decide that they're useful, or the product of a deranged mind who shouldn't have been allowed near a computer. (There was a post recently where someone had used C++ operator overloading such that "a = b + c;" changed b and c. Shudder.)
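For what it's worth, that kind of abuse is easy to reproduce in any language with operator overloading. A made-up Python sketch of the same trick (BadNumber is purely illustrative):

class BadNumber:
    def __init__(self, value):
        self.value = value
    def __add__(self, other):
        self.value += 1        # side effect: mutates the left operand!
        other.value += 1       # ...and the right one
        return BadNumber(self.value + other.value)

b = BadNumber(2)
c = BadNumber(3)
a = b + c
print(a.value, b.value, c.value)   # 7 3 4 -- evaluating b + c changed b and c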

Finally, it doesn't really matter all that much. If you learn C++, but the next class you take is "Data Structures and Algorithms using Java", you'll have a few things to catch up on, but you're not going to be completely lost. There are a lot of similarities between languages; you find pieces that you like better and pieces that you like worse in one or another, and it may influence what you choose for your personal programming. But professionally, you're more likely to have that dictated by your employer, and it won't be THAT uncommon for a prospective employer to treat the language choice as irrelevant. If you've done GPS data logging in C# for Windows Phone, the "GPS data logging for a phone" part is likely to be a more important piece of the hiring decision than the "C#" part; you'd be expected to be able to do something similar in Java for Android phones without too much additional effort.

x=3.14159
print x
3.14159
i%=x
print i%
3

Sometimes it's just about playing with the blocks directly, getting to know the shapes.
Structure? Some concepts don't even need structured code to demonstrate directly, instantly.

We do get people on the forum who are semi-clueless mimes when it comes to variables despite writing scores of lines of "well, it runs" code. That's just success waiting for help to happen, right? If it wasn't waiting, it would have happened.

westfw:
Pascal was indeed designed as a teaching language. But that was back in the mid-1970s, and it's been a bit neglected since the 1980s. To be useful, it needs "extensions" beyond the basic language, which are somewhat poorly standardized. I became disenchanted with Pascal when I realized how much it cheats - the language has features and syntax that a user can't duplicate (at least, not without more 'extensions'). And then there's the extent to which it seemed to deliberately limit itself to be ONLY a "teaching language" (not that that lasted; for a long time, a lot of the Apple system and application software was written in a Pascal-like language).

I still have my 1985 Byte issue where Niklaus Wirth wrote that to get a program right you write it twice and throw the first one away. Then he wrote that Pascal is that first one, and Modula is the one to keep. And who am I to argue? I've helped fix Pascal homework because logic is logic, but I swear that language is more anal-retentive than COBOL. I never tried the next step into that.

GoForSmoke:
I never tried the next step into that.

Sheesh - I was getting all psyched up for your Modula introduction :slight_smile:

...R

Sometimes it's just about playing with the blocks directly, getting to know the shapes.

I dunno. Can you actually do loops and stuff "interactively"? Certainly not in some of the microcontroller BASICs.
BASIC was my first language, and I've even written BASIC for work, a long time ago (ok: BASIC and x86 machine language!). I don't remember ever doing much with "interactive mode" (debugging, perhaps?) - maybe the interpreters I used didn't have that feature? In any case, that's just a glorified calculator - part of programming is planning the whole thing ahead...

(I had to look at i%=x for quite a while before I stopped thinking "modulus? BASIC doesn't do THAT!" :slight_smile: )

(Hopefully you don't have to do LET x=3.14159 )

(Heh. LISP and FORTH have fine interactive interpreters as well, but I'm not going to recommend them either. Python does pretty well as an interactive interpreter:

>>> from math import sin
>>> x=3.14159
>>> print x
3.14159
>>> i = int(x)
>>> print i
3
>>> for i in [x/6, x/4, x/3, x/2, x]:
...   print sin(i)
... 
0.499999616987
0.707106312094
0.866024961519
0.999999999999
2.65358979335e-06
>>>
)

westfw:
Python does pretty well as an interactive interpreter:

You can do the same sort of thing with Ruby, and many of the introductions to both languages use the interactive interpreter. However, I have never been able to see the point of it myself.

If I want to do calculations I have a calculator.

If I want to write a program then I want to save it in a file so I can use it over and over - or amend it if I've got it wrong.

...R

BulldogLowell:
I studied Latin in school. It's great. Today I can speak it with absolutely no one.

Apples and oranges. The first commercially useful program I wrote was in BASIC, and VB is probably the language I have made the most money from. I did some assembler and C/C++. Then I had to do a couple of projects in Delphi, and it brought clarity to what I had been hacking away at for years. Type, scope, and OOP in particular made so much more sense. When I went back to C++ it was so much easier to write clear, bug-free code the first time around. When I switch to another language or dialect, the important lessons carry over from Pascal, and I just have to learn some new keywords and fit within the syntax of the constructs provided.

Robin2:
Tosh!

Sounds like the 1950s classroom where Python belongs, where kids were punished for writing with their left hand, just because. Indentation has no computational value. It should not cause a compiler to choke. Indentation is for humans and should remain in the human domain. If it is that important, have the editor impose it automatically. While you are still trying to learn what a variable is, worrying about whitespace is a distraction nobody needs. Clearly this is a personal opinion :wink:

Once you learn how to use BASIC you'll never use it again.

Nonsense!!!

I have a Toshiba T1200 laptop with an LCD screen from about 1988 that runs QBASIC in DOS 3.3.

I still use it to collect data for my weather station. A 720K floppy disk holds months of data, and the laptop runs for hours on 6xAA batteries if the power goes out!

Also keeps a nice plot of the last ~20 hours of average wind speed and gusts on the screen.

For the BASIC aficionados among you, here is the wind gauge code:

REM read wind gauge and store statistics on disk.

REM circular buffer push and pull
DECLARE SUB gbuf (array!(), v!, nsp%, nsmax%)
DECLARE SUB pbuf (array!(), v!, ns%, nsp%, nsmax%)

REM plot wind speed data in box 1-600 on x, 0-99 on y, lower screen
DIM t0(600), t1(600)
DEFINT I-N
DEFSNG K

nsec = 300 'sample save interval in seconds

SCREEN 2 '640x200
'
' VIEW defines graphics viewport in absolute screen coords, subsequent
' pixel coords are relative to view window
'
' VIEW SCREEN  (ditto) but subsequent pixel coords are absolute
' Order of coordinate pairs is immaterial!
'
VIEW (21, 98)-(620, 198), , 1: REM view won't frame extreme edges
WINDOW (1, 0)-(600, 99): REM logical coords 1-600 on x, 0-99 on y
CLS

OPEN "com1:4800,n,8,1,cs0,ds0,cd0" FOR INPUT AS #1

LOCATE 1, 1: PRINT "Starting wind monitor at: "; DATE$; " "; TIME$
LOCATE 2, 1
INPUT "Output file name: ", n$
OPEN n$ FOR OUTPUT AS #2

PRINT #2, DATE$; " "; TIME$; " sample:"; nsec

r% = 0
REM number of points in graph, buffer pointers
n0m = 300: n0 = 0: n0p = 1
n1m = 300: n1 = 0: n1p = 1

REM get version #

'ON ERROR GOTO handler

LINE INPUT #1, a$
LOCATE 3, 1: PRINT "Version and time: "; a$
LOCATE 4, 1: PRINT "Hit Esc to exit..."

nsamp = 0: wmaxt0 = 0: avgt0 = 0: vart0 = 0

ON TIMER(nsec) GOSUB 5000
TIMER ON

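REM main loop: read a line from the sensor, skip two comma-separated
REM fields, then parse current and max wind speed and update the stats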
10 LINE INPUT #1, a$: LOCATE 9, 33: PRINT a$; "     "
	nc = INSTR(a$, ","): a$ = MID$(a$, nc + 1)
	nc = INSTR(a$, ","): a$ = MID$(a$, nc + 1)
    
15      temp0 = VAL(a$) / 22.6
	nc = INSTR(a$, ",")
20      temp1 = VAL(MID$(a$, nc + 1)) / 22.6: REM max reported by sensor
       
	IF (wmaxt0 < temp1) THEN wmaxt0 = temp1
	avgt0 = avgt0 + temp0
	vart0 = vart0 + temp0 * temp0
	nsamp = nsamp + 1

	LOCATE 11, 26

PRINT USING "& Avg: ### Max: ### mph"; LEFT$(TIME$, 5); temp0; temp1;

	IF INKEY$ <> CHR$(27) THEN GOTO 10

REM escape key hit, all done

GOSUB 5000
CLOSE #2
STOP

5000 r% = r% + 1: REM write record: average, max and s.d. this interval
avgt0 = avgt0 / nsamp
vart0 = vart0 / nsamp - avgt0 * avgt0
vart0 = SQR(ABS(vart0))
PRINT #2, USING "###.#,###.#,###.#"; avgt0; wmaxt0; vart0
CALL pbuf(t0(), avgt0, n0, n0p, n0m)
CALL pbuf(t1(), wmaxt0, n1, n1p, n1m)
nsamp = 0: avgt0 = 0: wmaxt0 = 0: vart0 = 0

REM plot data

CLS 1

' dotted gridlines
FOR i = 1 TO 600 STEP 4
PSET (i, 30): LOCATE 21, 4: PRINT "10";
PSET (i, 60): LOCATE 17, 4: PRINT "20";
PSET (i, 90): LOCATE 13, 4: PRINT "30";
NEXT i

' pull out most recent temp0 and max measurements

nsp = n0p
FOR i = n0 TO 1 STEP -1
CALL gbuf(t0(), v, nsp, n0m)
v = 3 * v: i2 = 2 * i
PSET (i2, v): PSET (i2, v + 1): PSET (i2, v - 1): PSET (i2 + 1, v): PSET (i2 - 1, v)
NEXT i

RETURN

handler:
LOCATE 3, 30
PRINT "error "; ERR; " at line "; ERL; TIME$; " "; DATE$;
LOCATE 4, 30
PRINT a$
RESUME NEXT

DEFINT K
SUB gbuf (array!(), v!, nsp%, nsmax%)
DEFINT I-N
REM NO check for buffer underflow
v = array(nsp)
nsp = nsp - 1
IF nsp < 1 THEN nsp = nsmax
END SUB

SUB pbuf (array!(), v!, ns%, nsp%, nsmax%)
DEFINT I-N

REM returns ns%, number of values pushed
array(nsp) = v
nsp = nsp + 1: ns = ns + 1
IF nsp > nsmax THEN nsp = 1
IF ns > nsmax THEN ns = nsmax
END SUB

MattS-UK:
Indentation has no computational value.

That may be true of the visual part of it. But I can't see any technical difference between starting a block of code with a TAB character or starting it with a '{' character. And it saves me having to type the closing '}'. And as a side effect, it makes the code easy to read.

Get over it :slight_smile:

...R

have the editor impose [indentation] automatically.

And of course, Python editors DO that.

I was a little surprised at how little the "whitespace issue" bothered me when using Python. I mean, it's only INDENTATION that matters; other whitespace is still malleable.
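A tiny illustration of that (mine, not from any class): spacing within a line is free-form; only the leading indentation is significant.

total = 0
for  n   in   range( 5 ):    # extra spaces inside the line are fine
    total += n               # but the leading indent here is mandatory
print(total)                 # prints 10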

(Hmm. Look at it this way: it's not the first time that a "recommended practice" has become a "required behavior of the language." The other main one I can think of is pre-declaring your variables (C, Pascal, PL/1...) (something that Python got rid of...))

westfw:
(Heh. LISP and FORTH have fine interactive interpreters as well, but I'm not going to recommend them either. Python does pretty well as an interactive interpreter:

>>> from math import sin
>>> x=3.14159
>>> print x
3.14159
>>> i = int(x)
>>> print i
3
>>> for i in [x/6, x/4, x/3, x/2, x]:
...   print sin(i)
...
0.499999616987
0.707106312094
0.866024961519
0.999999999999
2.65358979335e-06
>>>

)

My first BASIC let me use bits in integer vars. I consider learning bits early to be important foundation knowledge.
Is Python deficient in that department?
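(As far as I can tell it isn't; Python integers take the usual C-style bit operators and binary literals:

>>> flags = 0b1010
>>> flags | 0b0100      # set a bit
14
>>> flags & 0b0010      # test a bit
2
>>> flags ^ 0b1111      # toggle bits
5
>>> flags >> 1          # shift right
5
>>> bin(flags)
'0b1010'

)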

I think the real beauty of BASIC for beginners is that they can get utterly sick of it before too long and move on to something better. If you start on a hilltop, that's going to be harder to do.

Robin2:
That may be true of the visual part of it. But I can't see any technical difference between starting a block of code with a TAB character or starting it with a '{' character.

Was that some sort of pun? :wink:

I can see '{' and '}' characters but I can't see a TAB character, or any other white space or control characters.

Get over it :slight_smile:

I am over it, but from a beginner's perspective, I think it is better to be encouraged to consider the purpose of adding white space to increase human readability than to be taught to slavishly please a compiler.

I bet you just loved learning tables by rote too :smiley:

The other main one I can think of is pre-declaring your variables (C, Pascal, PL/1...) (something that Python got rid of...)

Pre-declaring variables in Pascal teaches you how a procedure, function or unit is arranged in memory. It makes it very easy to see where and why variables fall out of scope. Similarly with Pascal type declarations, which can seem unwieldy, but you soon get your head around type equivalency.

Not that type and scope are important computing concepts in any way :smiley:

MattS-UK:
I bet you just loved learning tables by rote too :smiley:

Yeah. A nice sense of achievement :slight_smile:

...R

Hi,
My first computer language was pseudo-code, used to introduce HPBasic (secondary school).

The only computer you needed was your brain; the input device, your eyes and the blackboard; the output device, hardcopy pen and paper; RAM, another piece of paper.

Even learnt flowcharting.

HPBasic was on cards: you used an HB pencil to cross out squares to optically encode the card.
Turnaround time was three days: one day on the bus to the computer, one day for the program run (if it ran), one day on the bus to bring it back.
So mistakes wasted very valuable programming time at school.

By the way, does anybody remember what 1H1 in a Fortran print format statement did, especially if it was trapped in a loop?
Apart from raising the blood pressure of the Computer Centre Manager.

Tom... :slight_smile:

Is that the Halt Catch Fire command?

GoForSmoke:
Is that the Halt Catch Fire command?

Just about.
It tells the lineprinter to go to the top of the next page: 1H1 is a Hollerith constant that prints a '1' in column 1, and the first column of each line was the printer's carriage-control character, with '1' meaning "page eject".
If it's trapped in a loop, you have fanfold paper cascading, in a magnificent arc, out the top of the lineprinter.
:o :o :o :o :o

TomGeorge:
you have fanfold paper cascading, in a magnificent arc, out the top of the lineprinter.

Presumably so students have free paper to write their lecture notes on? :slight_smile:

...R

I would personally suggest Python. Start off with Python 3, not 2; there are quite a few small differences, and as Python 3 is the version under active development, start with that. But like J-M-L said, learn the syntax structure and the logical flow of programming.

DON'T learn BASIC as your first language; it is outdated, and a lot of elements found in newer programming languages are missing. Some may argue that starting with a dynamically typed language like Python (basically, you make stuff up as you go along; it's not as strict as a statically typed language like C, C++, or Java) might make you lazy, but I find it a lot of fun. Python also sports a lot of libraries which simplify things and make them easier and more fun (a particular favourite of mine is Tkinter; it makes building GUI interfaces very easy, like 10 lines easy). Despite being an interpreted language (which does have its benefits), it's adequately fast.

Besides, why not try learning Python and C or C++ together? The choice is yours.
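To back up the "10 lines easy" claim, here is a minimal sketch (the window title and labels are made up) using Python 3's tkinter module:

import tkinter as tk               # Python 3 module name ("Tkinter" in Python 2)

root = tk.Tk()
root.title("Click counter")        # made-up window title
label = tk.Label(root, text="Clicks: 0")
label.pack(padx=20, pady=10)

count = [0]                        # a list, so the callback can update the value
def clicked():
    count[0] += 1
    label.config(text="Clicks: %d" % count[0])

tk.Button(root, text="Click me", command=clicked).pack(pady=10)
root.mainloop()                    # hand control to the GUI event loop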