What Is Your Stupidest Coding Mistake?

What is your most embarrassing / just plain "duh" coding mistake?

Let me start:

I wrote some simple code for a 10-LED crossfade. Every time I turned the pot to the far left, no LEDs would come on... It turned out that I had mapped the pot reading to 0 - 10, and the LEDs were driven by 10 if statements: if the mapped value equaled X (1-10), turn LED X (1-10) on. The problem was that far left was 0...
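
In case it's useful to anyone reading, here's a minimal sketch of that kind of fix - not the original code; the pot on A0 and the pin numbers are assumptions, and an array stands in for the ten if statements:

// Hypothetical rework, not the original poster's sketch.
// Mapping to 1-10 instead of 0-10 means the far-left pot position
// still selects LED 1 instead of a value nothing checks for.
const int ledPins[10] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11};  // assumed wiring

void setup() {
  for (int i = 0; i < 10; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  int sel = map(analogRead(A0), 0, 1023, 1, 10);            // 1..10, never 0
  for (int i = 0; i < 10; i++) {
    digitalWrite(ledPins[i], (i + 1 == sel) ? HIGH : LOW);  // replaces the ten if statements
  }
}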

That's my duh moment, now what's yours?

Did a loop with the for command:

for (int i = 0; i<10; i++);
{
  //stuff here
}

And it just totally ignored it... took me quite a while to notice the ; at the end of the for()...
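
For anyone who hasn't been bitten by this one yet: that trailing semicolon is an empty statement, so the loop spins ten times doing nothing and the braced block then runs exactly once. A sketch of what was presumably intended:

for (int i = 0; i < 10; i++)   // no semicolon after the )
{
  //stuff here
}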

For Ludum Dare 15 (maybe 16, I don't feel like looking it up) the theme was caverns, so I came up with a concept that was kinda sorta like Cosmic Avenger and Defender but in a cave (keep in mind this is a make-a-game-totally-from-scratch-by-yourself-in-48-hours competition).

to "simplify" things I choose to use vector grapchics, this allowed HUGE maps in small size since they were just point arrays

Once I had all that hammered out, the limitations of the framework I was using (old Love2D) meant I could only use the physics engine against the ship object, but not against the nearly 10-minute-long map.

I fought with the ship's collision against the line segments of the map until about 14 hours were left in the compo; finally breaking down and about to admit total defeat, I asked on the Love2D forums.

There, everything I needed to know was broken down into about a quarter of a line of four-function math.

I still made an entry, though I would not call it a game, complete with "original" music and SFX:

http://www.ludumdare.com/compo/2009/08/30/cavern-crawler-final/

My stupidest mistake? Not asking sooner.

I had to split an unsigned integer into two bytes...

bytea = (byte)myint % 256;
byteb = (byte)myint / 256;

After getting the upper byte consistently back as 0, it dawned on me that the cast does not operate on the whole expression, but only on the int.
So byteb = (at most 255) / 256 = 0.
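
For reference, a sketch of the two usual fixes - assuming, as above, a 16-bit unsigned int named myint: either parenthesize so the cast applies to the whole expression, or use shifts and masks.

unsigned int myint = 0xABCD;        // example value

byte bytea = (byte)(myint % 256);   // low byte:  0xCD
byte byteb = (byte)(myint / 256);   // high byte: 0xAB - no longer always 0

// Equivalent, and more common in practice:
byte lo = myint & 0xFF;             // low byte
byte hi = (myint >> 8) & 0xFF;      // high byte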

Back in the old days, I actually managed to write some Fortran code that redefined a constant...
(Fortran subroutines passed all parameters "by reference." So if you did "call foo(0, myvar)" the compiler would create a memory location containing 0, and pass the address to the function. If the function happened to modify the value of the parameter, the memory location containing 0 would then contain something else. But the compiler was smart enough to SHARE references to constants...)

Something I do shockingly often is this:

for (int i = 0; i>9; i++) {
  //code here
}

And I'm completely baffled as to why it doesn't work. Then I realize I accidentally did < instead of > or vice versa. ;D
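
(The condition is tested before the first pass, so with i starting at 0 and the test being i > 9, the body never runs at all. Presumably the intent was something like:)

for (int i = 0; i < 9; i++) {   // condition must be true for the loop to run
  //code here
}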

Not sure if this qualifies, since it wasn't caused so much by coding, but by not checking a command I had typed in before hitting return...

IIRC, I was 18 or 19 years old, working as a developer for my first real employer. My "supervisor" was the lead developer at this small mom-n-pop type dev shop, which developed a small insurance claims management system for Arizona's public health care system (AHCCCS).

One of our clients at that time was using our system in a "heads down" fashion, entering claims manually over RS232 serial VT100-emulation terminal sessions on 386 PCs (showin' my age!). We had three database tables involved for these claims: a claim table, a transaction table, and a line item table.

At the time, we had nothing like separate live and test environments; in many cases, testing and development were done on the production system (big mistake, I realize in hindsight). So, I was given the keys to the kingdom to develop this system, which involved doing some kind of data manipulation on these tables. However, since I couldn't use the live data, I had to use a copy of the tables. So my steps were:

  1. Copy the three tables to new names, which were referenced in my code.
  2. Try the code out.
  3. Check the data in the tables - was it right?
  4. If not right, delete the testing tables, go back to step 1
  5. If right - yay, I am done!

Step four was the "gotcha"; the tables had names like "CLAM_CLAM" (claims), "CLAM_TRAN" (transactions), and "CLAM_BENE" (benefits or line items); to this day I can still remember their names!

Anyhow, I called my test tables something like "TEST_CLAM", "TEST_TRAN", and "TEST_BENE". Well, I got to step four after many rounds of testing all day, and instead of typing "DELETE TEST_????", I typed "DELETE CLAM_????" on one of the tables (can't remember which, off-hand).

Well - as soon as I hit return, I realized what I had done; all my co-workers heard was me moaning loudly "NOOOOOOOOOOOOO!!!!!", and before anyone could ask me what was wrong, all phones in our office started ringing; every...single...one. This included my supervisor's, as well as the owner/president of the company's. I had single-handedly caused our client's entire claims entry division for that part of the company (which at the time was one of the largest insurance providers in Arizona) to stop. Not only that, but I had managed to delete all the claims from that day going back a couple of years.

After the dust had settled, and both my boss and I got chewed out (I imagine he got the worst of it - ie, why was he letting a kid delete records on a production system), we managed to recover all of the data up to that day's work from backups. To recover the rest of the data (that day's work), my boss and I sat down and designed and wrote a recovery tool that used the other two tables (which I hadn't deleted) to re-create the data in the deleted table; what couldn't be recovered (which was a very minor amount of the total), the client had to re-key from their paper claims. We didn't go home that night until well after midnight.

We laughed and joked about it afterward; our inside gag was that one of us would type the delete command, the other would check the entry, and if it looked right, one of us would hold the keyboard while the other stuck his finger over the delete key, and we'd lift the keyboard to "press the key".

The lessons I took from that day were "have a testing system" (and nowadays a staging area as well), and "when working on a production system, always, always, always double check what you are doing before completing the command or action". To this day, I am glad I had to learn that early in my career as a software developer, and that my boss and his boss (the owner) were cool with it, and that we recovered the data.

These lessons have followed me thru my career, and have also been helpful in electronics and such (helps to keep the magic smoke in place).

@cr0sh Dang man! All others seem suckish, but that would just be hell! I can't even imagine that... I would be like :cry: followed by a loud bang and a thud...

In any other context, it would've been a "career ending move"; I probably should've been fired. Between that and one time telling the owner to "f-off" (seriously; I apologized a couple hours later after I had cooled off), it's a wonder I wasn't canned. That's what being a hot-headed, impulsive kid will do to ya, I guess!

;D

It's probably also what saved my bacon, along with the owner being a good guy to work with and for. He also understood my passion for computing; he helped me buy a new computer not once, but three separate times (an Amiga 2000, an Amiga 1200, and an Acer 486 DX 50 - I still have all three), with interest-free loans for me to pay back on a paycheck-to-paycheck basis. Eventually I was let go from that job (a blessing, actually - I found a new job a couple weeks later making more money); the company was hemorrhaging money and collapsed not long after. The owner, though, when he laid me off, let me stop payments on the loan until I got back on my feet. Once I got a new job, I started making the payments again; the last time I saw him was when I handed him the last payment in person.

Since that time I've worked for four other companies doing software development, but I will always remember that group fondly - they let a kid with no college degree develop software for them, and essentially laid down the road my career has followed. I learned a lot from that job (from coding standards to how to load a 9-track open reel vacuum column tape drive - that was old-tech even then!), lessons I carry with me to this day and apply daily.

:slight_smile:

Very cool guy! I've always sort of wondered; where do you work now, if anywhere?

Right now I work for a small web application development company (6-10 employees, depending on how you count contractors and such - I'm a full-time salaried employee) called Inexo.

We do the majority of our development work using a LAMP stack (Linux, Apache, MySQL, PHP); we try to steer clients toward using such a system, but we're flexible enough that if they need or want something different (maybe they want the AMP part but want to run under Windows, for instance), we can easily accommodate that. We're really flexible, and our team turns out some great work (IMHO).

I enjoy going to work every day, and I love the people I work with (heck, today the owner of the company and I went out to have a bite to eat and see Apache Reclamation - he was pretty amazed; I'm always talking about going there and picking up junk). It's a very casual working environment (jeans and t-shirt), but we're serious about delivering a quality product to our clients, and I think we succeed at doing that.

@cr0sh: actually it was not really "your fault". Whoever sets up an environment for development and testing on production systems is doomed in the first place. That is: it is easy to blame the developers for something like this but actually it was a very big mistake by the people providing the setup.
The point is that mistakes will always happen. Whoever assumes 100% perfect developers - who will never foul up - just does not understand software development.

Udo

I have had many stupid mistakes.
The most common is obviously semicolons, but I think my worst was trying to go back to the start of my code by calling loop(); again. Needless to say, I ran out of RAM pretty quickly and the code then stopped working. Baffled me for ages! Now I sensibly use return; :wink:

Mowcius
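
(For anyone wondering why calling loop() again blows up: each call is a fresh function call that never returns, so stack frames pile up until RAM is exhausted. A minimal sketch of the difference - hypothetical code, with restartNeeded() as a made-up placeholder:)

bool restartNeeded() {   // made-up placeholder condition
  return false;
}

void setup() {
}

void loop() {
  if (restartNeeded()) {
    // loop();   // recursing never returns; every call piles another frame onto the stack
    return;      // returning lets the Arduino core call loop() again on its own
  }
  // normal work here
}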

Udo:

actually it was not really "your fault". Whoever sets up an environment for development and testing on production systems is doomed in the first place. That is: it is easy to blame the developers for something like this but actually it was a very big mistake by the people providing the setup.

I agree, but there really wasn't any choice. Our company was so small we couldn't afford the same-sized hardware we sold our clients; we were using IBM RS/6000 boxes running AIX - we developed our code in-house on an RS/6000 320 (IIRC), essentially a PC-sized workstation (but it cost waaaay more than a PC did in those days). We sold our clients 520s (and larger); these were fairly hefty boxes (once again, for the time), with hefty price tags.

Also - at the time - IBM would do such things as stick extra drives and memory in the boxes; then, if you wanted to "upgrade" your system, you had to call IBM for a tech to come out and (sometimes literally) "flip a switch" (more likely set a jumper) to "enable" the extra drive or memory. You couldn't run down to the local computer store (back then, here in Phoenix, it was Insight), buy a drive, and slap it in without voiding your service warranty. IBM would end up charging you many times the price of what you could get otherwise. We didn't have a choice of what to use, though, because the software development environment we used (VMark Software's uniVerse) only ran on certain Unix environments (none of them cheap).

So - in order to have a good amount of data, etc., while still having room for our code - we had to use client boxes for certain development processes. We tried to use only our development server for most things, but there were times when we didn't have the "real world" data needed for testing (and we couldn't get a copy off the client servers themselves - you can only download so much over 9600-baud leased lines!).

Heck - at that employer we were still doing things like transferring data from punched cards to 9-track open-reel 1/2-inch tape, just to get the old data into our systems. Laser printers were still a luxury; for most of our stuff we sold large and loud Genicom 132-column line printers that could run through a box of greenbar in no time (I learned how to make one of the printers "sing" by setting certain parameters and sending the proper lines to cause large swaths of dark strikes and such - not good for the hardware, though!).

Maybe it wasn't completely my fault, but I still should've been more vigilant in what I was doing; instead I got lulled by routine, slipped up, and caused a larger business to lose a lot of money that day. I still consider it a valuable lesson, regardless.

:slight_smile:

I'm not sure it qualifies... but...

Made a C# application, and had a value change inside an evaluation statement.
I still have no idea how this could happen, but even the experienced coders of the company were flabbergasted by it!

I used to do all sorts of stupid stuff at school. We learnt Turbo Pascal, which was DOS-based.

Probably the stupidest was randomly reading/writing to blocks of memory and the hard drive. It wasn't an accident, it was more youthful curiosity. Reading from memory made Win95 do some fairly odd stuff.

I think the best programme I wrote was one that mimicked some DOS function but went on to pretend to format your floppy, complete with non-destructive drive access to make the right noise :slight_smile:

My stupidest coding mistake?

Learning BASIC.

To be fair, I wasn't given the choice, and at the time the school didn't have a modem, so we batch-processed BASIC (yes, you read that right!). Making sure your program was correct before taking the (paper) tape to the local Uni's mainframe for processing was essential, because turn-around was on the order of a day.

Forgetting "0" is a number...

if (c=0){
}

... find the error ...
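
(Spoiler, for anyone still hunting: c = 0 assigns rather than compares, so the condition evaluates the freshly assigned 0 and is always false. The usual fix:)

if (c == 0) {   // == compares; a single = assigns and then tests the assigned value
}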

Imahilus: I once worked at a place where the compiler and debugger could get "out of sync"; your code would look right, you could step thru it with the debugger, but when you ran it, it would error out. It was a really weird issue, but we traced it back to a bug in the debugger (not a good thing to find out about) and let the company which made the development system know about it.

Groove:

Learning BASIC.

I think you mean "learning BASIC at the wrong time" - which unfortunately has tarnished BASIC's image forever (batch processing of BASIC - ack!). Today's modern BASIC dialects are worlds better than what you learned on (heck, QuickBasic 4.5 and PDS 7.1 were light years ahead of what you had - and those both stink compared to today's BASICs).

Check out offerings like BlitzMax, FreeBASIC, GAMBAS, etc...

While still BASIC at a certain level, they all offer modern language constructs: object orientation, in-line assembler, pointers, native compilation, etc.

Note: I am not sure I can include VB.net in the mix - VB.net isn't quite like BASIC, it isn't quite like C (C#?), and it isn't quite like VB6 (which I think was the last true "BASIC" Microsoft released). I tend to consider it its own incarnation: a form of BASIC in name only. It is like BASIC in the way Python is like BASIC (I consider Python to be a great learning and production language too; it might make a better environment to learn in than other languages, and allow an easier transition).

:slight_smile: