Author Topic: Self learn programming?  (Read 1140 times)
Jr. Member

Hi. Okay, so my school's EE program doesn't put a lot of focus on developing our programming skills, so as a result I only know how to write C and a bit of MIPS assembly. My goal for the summer is to get exposed to more programming languages that are applicable to my major. Which languages do you guys think I should get familiar with?

1) C++ is at the top of my list, as I have never been exposed to OOP and I know it is becoming very popular nowadays. C++ is based on C, so I imagine it won't be too time-consuming to pick up.

2) HDL - For some reason an HDL class is not required for my degree. I imagine it is such a huge and useful topic that I should probably take a class as an elective in school rather than trying to learn it myself...?

3) Web development - I think knowing how to write website scripts is a good skill to have. I personally believe the internet is one of the greatest achievements of mankind, and it will only become more important in the future. So I should probably at least know the basics of HTML/CSS/JS and PHP/Python?

4) I am not sure if there will be any use for an EE major to learn Java...?
5) How about .NET framework languages like C#?

Sorry I ended up typing a huge list of stuff. I could really use some guidance on how to use my next 3 months wisely to improve my knowledge of programming in general. Any input will be greatly appreciated. Thanks!
« Last Edit: June 26, 2013, 10:51:20 pm by dominicfhk »

Newbie

C++ was developed as a newer extension of C. Aside from some terms being different, much of the syntax is the same. I was comfortable in C (I could get around), and I taught myself C++ by merely sitting down with a syntax-highlighting IDE and just starting to write code.

One of the more popular HDLs is Verilog, and its syntax is itself loosely based on C. Many FPGAs are programmed in Verilog. You will have to look up examples, as I don't know much about how it is actually written, but if you understand the structure of C conceptually, Verilog should not be too big a jump.

Faraday Member (Phoenix, Arizona USA)

1) C++ is at the top of my list, as I have never been exposed to OOP and I know it is becoming very popular nowadays. C++ is based on C, so I imagine it won't be too time-consuming to pick up.

More or less, C++ is C with some extra keywords; what you should really study are the concepts of OOP - and perhaps event-driven programming. Ultimately, you should gain a grounding in the lower-level software concepts that apply across virtually all programming languages; that will let you transfer your skills to whatever the "programming language du jour" happens to be.

That said - even if you don't intend to go deep on the programming side of things as an EE - C/C++ knowledge will likely be applicable throughout your career.
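To make the OOP point concrete, here's a minimal C++ sketch of what those extra keywords buy you - a base-class interface with virtual dispatch, something plain C would have to emulate with structs of function pointers. The class names are just made up for illustration:

Code:
#include <iostream>

// A base class defines an interface; "virtual" enables runtime dispatch.
class Sensor {
public:
    virtual double read() const = 0;   // pure virtual: subclasses must implement it
    virtual ~Sensor() {}
};

class TempSensor : public Sensor {
public:
    double read() const override { return 21.5; }  // stub value for the demo
};

class VoltSensor : public Sensor {
public:
    double read() const override { return 3.3; }   // stub value for the demo
};

int main() {
    // One call site works for any Sensor subclass - that's polymorphism.
    Sensor* sensors[] = { new TempSensor, new VoltSensor };
    for (Sensor* s : sensors) {
        std::cout << s->read() << "\n";
        delete s;
    }
}

In C you'd get the same effect by hand with a struct holding function pointers; C++ just builds that machinery (the vtable) into the language.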

2) HDL - For some reason an HDL class is not required for my degree. I imagine it is such a huge and useful topic that I should probably take a class as an elective in school rather than trying to learn it myself...?

As djkolumbian noted, Verilog - one of the most popular HDLs - is loosely based on C. I will say, though, that even so, wrapping one's head around how that C-like code translates into an actual circuit design (plus all the weird stuff with asynchronous parallel processes, clocks, and everything else that happens in such a design) looks to me (as someone without much formal training in electronics or software development) to be a very difficult thing to do on one's own. That doesn't mean it's impossible (and for myself, I'm sure I could figure it out if I had a real need to) - but you might find it easier to learn - or at least to build a base - if you took a course on it.

3) Web development - I think knowing how to write website scripts is a good skill to have. I personally believe the internet is one of the greatest achievements of mankind, and it will only become more important in the future. So I should probably at least know the basics of HTML/CSS/JS and PHP/Python?

If you intend to have a career as an EE, I'm not sure where you would need to develop web sites (or any kind of web application). You could probably pick any of that up (at least the basics) over time on your own.

4) I am not sure if there will be any use for an EE major to learn Java...?

Again - if you learn the basic concepts of OOP, and you pick up C++ - as well as the other fundamental concepts - it shouldn't matter what language you end up having to learn, as long as you understand those underlying concepts. Unfortunately, getting familiar with those concepts takes either pursuing a lot of what goes into a CS degree, and/or reading and understanding similar material (as well as working with a ton of languages). I'm fairly familiar with a lot of those concepts despite not having a CS (or any real) degree, but only because I have read a ton of CS-oriented books, papers, programming magazines, and web sites, looked at tons of code, and programmed in several dialects of different languages (including various microprocessor assembly languages - I also have minor experience hand-assembling hex codes for the 6502). In short, this knowledge was won the hard way; but I don't think there is an easy way, either.

5) How about .NET framework languages like C#?

Again - see above. Ultimately, most languages you will encounter are based (at least largely in syntax) on C/C++ - sometimes with a bit of Pascal thrown in. Many pull concepts and ideas from languages you have never heard of (some of which were only popular with the military/DOD, or were academic languages that never saw much use outside the university setting). All (well - most) rely on very basic concepts that are common across all languages.

Another language you might find useful to learn from an EE perspective is Matlab (or its open-source cousin, Octave); it has a fairly easy-to-understand syntax, but its primitive variable types are based on linear algebra (vectors and matrices). As such, if you understand how to use those primitives to do calculations in a parallel, vectorized manner, you can use such languages for parallel processing (provided your problem can be reduced to such an algorithm). Indeed, both have interfaces that let you launch and run your programs on anything from a simple multi-core machine up to a large-scale computing cluster (should you have access to such a system, and a problem that needs that much computational power). I'm not sure how much use you will have for such a language as an EE, but it might be worth looking into...
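Just to illustrate that vectorized style - in C++ rather than actual Matlab/Octave, so take it as a rough sketch of the idea - std::valarray applies arithmetic elementwise across whole arrays, much like Matlab applies operators to whole vectors:

Code:
#include <iostream>
#include <valarray>

int main() {
    // Elementwise operations on whole arrays, Matlab-style;
    // in Octave this would just be  v = 2 * t + 1;
    std::valarray<double> t = { 0.0, 0.5, 1.0, 1.5 };
    std::valarray<double> v = 2.0 * t + 1.0;   // applied to every element at once

    // A dot product - sum(t .* v) in Octave terms.
    double dot = (t * v).sum();

    for (double x : v) std::cout << x << " ";
    std::cout << "\ndot = " << dot << "\n";
}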

Sorry I ended up typing a huge list of stuff. I could really use some guidance on how to use my next 3 months wisely to improve my knowledge of programming in general. Any input will be greatly appreciated. Thanks!

If you are interested in software and hardware, and how they intersect and interact, you might do well to study the history of computation. Our concept of the computer as a symbolic processing machine - as opposed to merely a calculator - really didn't take hold until the work of Alan Turing and Alonzo Church was published in the 1930s and spread through the decades that followed. Prior to that, things were conceptually much different.

There are also the concepts behind Universal Turing Machines, which might help you to understand computers and computation in general - and how it all relates to modern software development; it isn't necessary, but it can be fascinating to see where things come from. There are even languages based on such minimal processing constructs (Whitespace and brainf*ck being two of the best known). There are also old "games" like Core War that can help you learn simple assembler concepts while being fun to play with. Then there is the whole concept of Cellular Automata - and if you delve deep into that, you start dropping into evolutionary algorithms, eventually into artificial life, then on to artificial intelligence and machine learning, etc. Wolfram's book "A New Kind of Science" delves deep into the idea of Cellular Automata acting as Turing Machines. You might even be brave enough to see the parallels with biological processes, DNA, RNA, mRNA, etc...
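To make the Cellular Automata idea concrete, here's a tiny C++ sketch of Rule 110 - one of the elementary CAs Wolfram discusses, and one that has been proven Turing-complete. This is just my quick illustration, not anything out of the book:

Code:
#include <iostream>
#include <vector>

int main() {
    const int width = 64, steps = 24;
    const unsigned rule = 110;          // Wolfram's numbering for the update rule
    std::vector<int> cells(width, 0);
    cells[width - 2] = 1;               // a single live cell as the seed

    for (int s = 0; s < steps; ++s) {
        for (int c : cells) std::cout << (c ? '#' : '.');
        std::cout << "\n";

        std::vector<int> next(width, 0);
        for (int i = 0; i < width; ++i) {
            // Each new cell depends only on its 3-cell neighborhood; the
            // rule number's 8 bits encode the outcome for each pattern.
            int l = cells[(i + width - 1) % width];
            int m = cells[i];
            int r = cells[(i + 1) % width];
            next[i] = (rule >> (l * 4 + m * 2 + r)) & 1;
        }
        cells = next;
    }
}

Run it and you can watch the characteristic Rule 110 triangles grow from a single cell - the same kind of picture that fills Wolfram's book.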

Then there are weird things like:

http://en.wikipedia.org/wiki/One_instruction_set_computer

...and:

http://en.wikipedia.org/wiki/1-bit_architecture
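The canonical OISC example from that first link is "subleq" (subtract and branch if the result is less than or equal to zero); a toy interpreter fits in a dozen lines. Here's a sketch with a tiny hand-assembled program baked in - my own illustration of the idea, nothing standard:

Code:
#include <iostream>
#include <vector>

// Toy "subleq" interpreter: the single instruction is
//   subleq a, b, c  ->  mem[b] -= mem[a]; if (mem[b] <= 0) goto c;
int main() {
    // Hand-assembled program: adds mem[12] into mem[13], using mem[14]
    // as a zero-initialized temp, then halts (branch to a negative pc).
    std::vector<int> mem = {
        12, 14, 3,    // Z -= A   (Z becomes -A)
        14, 13, 6,    // B -= Z   (B becomes B + A)
        14, 14, 9,    // Z -= Z   (clear the temp)
        14, 14, -1,   // branch to -1: halt
        7, 5, 0       // data: A = 7, B = 5, Z = 0
    };

    for (int pc = 0; pc >= 0; ) {
        int a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
        mem[b] -= mem[a];
        pc = (mem[b] <= 0) ? c : pc + 3;
    }
    std::cout << "7 + 5 = " << mem[13] << "\n";  // prints 12
}

Three subleq instructions implement an addition; everything else a "real" instruction set does can be built up from the same trick.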

The thing is - if you can understand the hows and whys of all of the above, you'll not only gain an appreciation for the higher levels of "abstraction", but also a better understanding of how it all works underneath.

As you can probably tell, I have found all of this stuff utterly fascinating over the years (decades!) - computing in general has fundamentally altered how we think about and understand so much of our world; it has touched every aspect of our lives, and has indelibly altered our perceptions and society at large. Furthermore, it is continuing to do so - we are nowhere near the end of this yet.

