I am using PSeInt, a pseudocode interpreter, to introduce high school students to algorithmic and computational logic. Other teachers start their classes with professional IDEs and languages like C++ or Java, although lately most of them want to teach and learn Python.

For those starting with professional IDEs, one report says they aren't as successful as students who start with programs like Scratch, Flowgorithm, PSeInt, or Arduino.



One of the latest features of PSeInt is Unicode support: for example, when you type <=, >=, <>, etc., it replaces them with the standard Unicode symbols ≤, ≥, ≠.
I hope this can be incorporated into the Arduino IDE.

See the animation below.

Good idea?

Please no. This experiment was tried back in the '70s with a language called APL. You got a special terminal, including a special keyboard, with all sorts of special symbols for various operations. Every key on the keyboard got an extra symbol, so they weren't limited to well-known math symbols, and then they added "overstruck" operators as well...

It was interactive and interpreted, and was probably designed to displace BASIC. When I was in university (1977-81), my school taught APL to the non-engineering, non-science majors (CS majors learned PL/1; other science and engineering folk learned Fortran). Its big advantage seemed to be that "interpreted" bit: a user could load a "workspace" and have access to high-level functions written (by someone else) for a more specific field (rather the way that Python is used today).

AFAIK, the extended character set was considered an abject failure. It reduced availability, added complexity, and made code nearly unreadable. :frowning: (now, this is also about the same timeframe that "normal people" (as opposed to would-be clerical staff) were learning to type. So making the keyboard much more complicated probably didn't help.)

Despite the fact that nearly all modern keyboard/display setups are capable of dealing with much broader character sets, I don't think that this is an experiment that needs repeating :frowning:

(Further, I've never heard the lack of characters described as a problem that needs solving, with the possible exception of a relative minority of mathematically inclined folks who are really upset by the "confusion" they think is caused by using the "=" equality symbol for assignment...)

(Finally, Arduino is at the mercy of the C/C++ compiler and language that it uses. Using unicode symbols in Arduino would require that they first be used in C++...)

Thank you very much for the APL information.

The keyboard of most computers is designed for writing books, not for writing code =(

In the animation above, we see that when you type <>, the editor automatically replaces it with the ≠ symbol. The option I propose is to type operators as has always been done in ASCII (<=, >=, <>, !=, etc.) and then have the editor replace them with the corresponding Unicode symbol, controlled by a flag so users can choose whether to see the standard ASCII encoding or not.

All this to make the code more readable. I would even like arithmetic expressions to be entered in 2D; for example, the editor of the TI-Nspire PC application has a 2D, Unicode, non-linear program-code editor. Most programming languages are written linearly because, at first, it was very difficult to display arithmetic expressions the way they are written in mathematics books.

We are in the era of Unicode; coding has to evolve, especially for students who are just starting out in the world of algorithmics and computing.

TI-Nspire for PC and iOS

Unicode is one thing. Special keyboard routines are a different thing.

The Arduino Uno, Mega, Nano, Micro, and Leonardo all use 8-bit microprocessors, which makes old-fashioned ASCII much more practical. I suspect almost all the underlying code would need to change if the use of Unicode were to become the norm, and those changes would seriously impact the speed at which tasks can be done.

I am able to remember a few special keyboard routines to produce a µ and a ° for example. The idea that I would have to learn more of them would be an appalling prospect.


Unicode is one thing. Special keyboard routines are a different thing.

I am able to remember a few special keyboard routines to produce a µ and a ° for example. The idea that I would have to learn more of them would be an appalling prospect.


You are not alone.
I cannot imagine how many millions of people would suddenly have to learn new things for what is, at best, a "niche" market.

Let alone the re-tooling required to sell very few oddball keyboards with non-standard legends or re-mapping.

(To be fair, it looked like the editor in the animation automatically turned a typed "!=" into the fancy character. It's not even clear whether that made it into the source code, or was just an editor/display hack.)

APL: from what I heard, the goal was to write a full-featured editor in one line of code!

I don't think that the Unicode symbols will make source clearer, and for those of us with hardened eyes they are just a bit harder to read.

If you want a good teaching language, try Forth; there is a free copy of Brodie's Starting Forth and a Forth to work with. I learned Forth-79 in 1983 and it liberated me. It changed how I wrote BASIC, which back then was what my small-business customers wanted, so I wrote more and more effective BASIC until customers heard about C++. Forth is at heart OOP but without rails; it compiles at runtime, and you can write Forth that writes Forth.

I learned on a VIC-20: a 6502 with 5K of RAM and an EEPROM cartridge slot. HES Forth-79 fit in 8K, and it was amazing what I could do in just 5K of RAM, more than simply learn.

7-bit ASCII leaves the sign bit open, and it's hard to say how many schemes have used it as a flag. I don't WANT to go to 16-bit text; it's not necessary and it's wasteful. On Arduino it would be worse than String objects.

and you can write Forth that writes Forth.

You can do that with Ruby and Python as well. Some of the nerds think it's clever, but I remain to be convinced that it is something one should do.

It can be hard enough to figure out the code that a human wrote :slight_smile:


PS ... even when I am the human that wrote it :slight_smile:

You're taking it to an extreme, and C++ can do the same thing with classes, only it takes far more source code.

In C++, making a class that makes class objects is the first level of code that writes code.