
I am a software engineer, database developer, web developer, social media user, programming geek, avid reader, sports fan, data geek, and statistics geek. I do not care if something is new and shiny. I want to know if it works, if it is better than what I use now, and whether it makes my job or my life easier. I am also the author of RegularGeek.com and Founder of YackTrack.com, a social media monitoring and tracking tool.

Bad Programmer, Bad Process Or Bad Education

10.07.2010

A few articles in the past week prompted some thinking about the industry of software development. In this case, when I say software development, I am talking about developing websites, web applications, commercial software, enterprise software and almost anything else that requires someone who can code. Compared to other scientific industries, software development is still young, but more importantly, it is one of the few industries that anyone can get into.

First, let’s look at the articles that prompted all of this. On DZone, there was a post asking us to embrace the fact that we are bad programmers. The first paragraph really summarizes the post:

How many developers think they’re good programmers? We’re not. We’re all fairly bad at what we do. We can’t remember all the methods our code needs to call, so we use autocompleting IDEs to remind us. And the languages we’ve spent years, in some cases decades, learning? We can’t even type the syntax properly, so we have compilers that check and correct it for us.


Admittedly, the post does talk about the benefits of unit testing and other good things, but the premise is interesting. Why are we still dependent upon IDE auto-completion? Why do we really need compilers and interpreters to tell us that we typed the syntax incorrectly? Why are we still talking about syntax?

There are really two parts to this problem. The first is that computer-aided software engineering (CASE) is mostly a failed segment of the industry. The tools were not that bad; the problem is that the industry is not mature enough to properly specify the type of functionality we need in an application. Part of the reason is that software development is hard. It may also be related to the fact that the industry is mostly unregulated. Pharmaceuticals are heavily regulated. Architecture and construction are regulated, or at least they have rules they must follow. Software engineering has soft guidelines at best. I am not talking about SDLC processes like Waterfall, XP and Scrum; I am talking about things our code must or must not do. Static analysis tools like PMD or FindBugs may be part of this (a small example of the kind of rule they enforce appears below), but for the industry to really mature, it feels like something is missing.

Another part of this is the certification or licensing required for these other professions. Architects typically need an advanced degree, and depending upon what you design, you may also need certifications or licenses. Scientists at a drug development company eventually need at least an MS degree, and in many cases a PhD. You cannot become an architect or a scientist without these advanced degrees or certifications. I am not calling for certifications in software engineering, but this is one of the few industries where you can just pick up a book, start coding and get a job. That does not always mean these developers are good developers, just that anyone can become a developer.
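
As a rough illustration of what those missing rules might look like (the class and method names here are invented; this is only a sketch), consider the kind of mechanical defect static analysis tools already catch today: FindBugs flags a class that overrides equals() without hashCode(), and PMD flags a catch block that silently swallows an exception.

```java
// Hypothetical example of code that static analysis tools would flag.
import java.io.FileInputStream;
import java.io.IOException;

public class AccountId {
    private final String value;

    public AccountId(String value) {
        this.value = value;
    }

    @Override
    public boolean equals(Object other) {
        return other instanceof AccountId
                && ((AccountId) other).value.equals(value);
    }
    // No matching hashCode() override: AccountId keys will misbehave in a
    // HashMap or HashSet. FindBugs reports this as HE_EQUALS_USE_HASHCODE.

    static void readConfig(String path) {
        try {
            FileInputStream in = new FileInputStream(path);
            // ... read configuration ...
            in.close();
        } catch (IOException e) {
            // Exception swallowed with no handling or logging: PMD's
            // EmptyCatchBlock rule flags catch blocks like this one.
        }
    }
}
```

Rules like these exist and work, but they cover narrow, mechanical mistakes; that is part of why it feels like something broader is still missing.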

The second part of this is that the industry has become heavily focused on the application side of computer science. I believe this was brought on by the industry itself. When I was in college, there was already pressure from industry to ensure that people graduating with a degree in computer science could be immediately productive in a business environment. You would hear that developers coming out of school did not know how to build an application. The standard curriculum will always change due to the rapid pace of change in our industry, but changing the curriculum to fit what the industry wants may not be the answer. I would argue that the point of education is to teach students the basics of a subject and to teach them how to learn more. If we target students to be more marketable to industry when they graduate, we lose some of the core knowledge we are supposed to give them. Some of the “leading” universities still have programs that teach a breadth of information, but some smaller or more local universities are going the “marketable graduate” route.

The other post that prompted this was from Dave Winer regarding doing more computer science.

I’d like to see young computer science students get up and present to their teachers some new computer science. Some new use of graph theory. Or an old one re-applied to a modern world. We used to do that, when I was a comp sci grad student in the 1970s. I think we got way too caught up in the commercialism.


Why is this a problem? Well, the real issue is that in order to reach the realm of the scaling specialist (or any specialization), or even to build one of these fancy social applications, you need to know a lot of information:

  • Basic data structures – What is the difference between a linked list and a hashmap? (The first sketch after this list shows that difference in action.)
  • Basic algorithms – There are several sorting algorithms and many of them suck. There are also different algorithms for when you are memory-constrained, CPU-constrained or disk-constrained.
  • Object Oriented Programming and other programming types – Object-oriented programming is not the only type of programming. There is also procedural, functional and logic programming, along with other stranger paradigms. Knowing some basic information about these can really make you think about a problem differently. (See the second sketch after this list.)
  • Database systems theory – Basic knowledge of how databases work is helpful in understanding why your application is slowing down. The NoSQL movement may not be the same as RDBMS technology, but some of its core concepts still come from standard database theory.
  • Theory of Computation – There is a lot of theory behind the often-mentioned Big-O notation, and it helps you understand why your algorithm is so damned slow.
  • Statistics and Probability – Much of computer science theory is based on mathematics. Statistics and probability also helped forge many concepts in machine learning and artificial intelligence.
  • Machine Learning and Artificial Intelligence – This is typically kept outside the standard curriculum, but many applications are based on recommendation, categorization or prediction. A basic understanding of what is possible would be enough to give you an idea of where to look next.
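
Here is a minimal sketch of the data structures and Big-O points (the class name and the crude timing are only for illustration). The linked list versus hashmap question is really a Big-O question: a membership test on a LinkedList walks every node, while a HashMap hashes the key straight to a bucket.

```java
// Minimal sketch: the cost of a lookup in a LinkedList vs. a HashMap.
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

public class LookupCost {
    public static void main(String[] args) {
        int n = 1000000;

        List<Integer> list = new LinkedList<Integer>();
        Map<Integer, Integer> map = new HashMap<Integer, Integer>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            map.put(i, i);
        }

        long start = System.nanoTime();
        boolean inList = list.contains(n - 1);   // O(n): walks the list node by node
        long listNanos = System.nanoTime() - start;

        start = System.nanoTime();
        boolean inMap = map.containsKey(n - 1);  // O(1) on average: hashes to a bucket
        long mapNanos = System.nanoTime() - start;

        System.out.println("LinkedList.contains: " + listNanos + " ns, found=" + inList);
        System.out.println("HashMap.containsKey: " + mapNanos + " ns, found=" + inMap);
    }
}
```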

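And here is a second sketch, for the paradigm point (the class and method names are again invented): the same trivial problem, summing a list of integers, written once in the familiar procedural style and once in a functional style with no loops and no mutation. Even a toy comparison like this can make you think differently about how you decompose a problem.

```java
// Minimal sketch: one problem, two programming styles.
import java.util.Arrays;
import java.util.List;

public class ParadigmSketch {

    // Procedural style: spell out the loop and mutate an accumulator.
    static int sumProcedural(List<Integer> numbers) {
        int sum = 0;
        for (int n : numbers) {
            sum += n;
        }
        return sum;
    }

    // Functional flavour: no loop, no mutation; the result is defined in
    // terms of a smaller version of the same problem.
    static int sumFunctional(List<Integer> numbers) {
        if (numbers.isEmpty()) {
            return 0;
        }
        return numbers.get(0) + sumFunctional(numbers.subList(1, numbers.size()));
    }

    public static void main(String[] args) {
        List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
        System.out.println(sumProcedural(numbers)); // prints 15
        System.out.println(sumFunctional(numbers)); // prints 15
    }
}
```
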
This is just a basic list of foundational knowledge. As you dig into the more advanced topics, you can find tons of interesting information. Many computer science graduates do not take courses in each of these areas. I believe this focus on job readiness has forced us to skip the courses that help us create the next computing paradigms. This is why Dave Winer complains that we need to do more computer science: less of it is being done in schools. He talked about the companies at TechCrunch Disrupt recently:

The products were all cut from a small number of molds. Somehow they have fixated on turning the internet into a game.


How does all of this fit together? The lack of tools to formalize the process could be due to our lack of foundational computer science; a large breadth of knowledge would need to go into tools that could give us standard rules of programming. The lack of education in advanced topics also hurts computer science innovation, and when innovation lags, there are fewer opportunities for “world-changing” computing technologies. Even if we do not target world-changing ideas, a greater breadth in education would give you different ideas about what you could accomplish.

It may not be that you are a bad programmer; we all make mistakes. It may not be that you are using a bad process, even though we all know the Waterfall process has its problems. You may not have a formal education in computer science either. All that being said, none of it excuses us from learning from our mistakes, learning more about the software development process, and learning more about the areas of computer science that are unknown to us.

Published at DZone with permission of Robert Diana, author and DZone MVB. (source)


Comments

Jilles Van Gurp replied on Thu, 2010/10/07 - 12:35pm

The first few years in industry tend to be very much an apprenticeship where you learn about the technology you will work with, the processes used to collaborate with others, and most importantly from your own mistakes. It helps if you understand the technology and computer science. But frankly, I rarely spend much time thinking about algorithmic complexity, correctness proofs, how to build a compiler, how to fix some database internals, etc., which were all topics I was taught about in university. I know enough about those topics from my academic years to get started if I ever need to. But none of these topics really matter very much to my work.

In my view, the real problem with computer science education has always been that there are relatively few experienced software engineers streaming back into an academic career to teach software engineering. As a consequence, most software engineering courses teach outdated or counterproductive practices because the teachers lack the necessary skill or experience to do a proper job.

I know because I actually did a PhD on software engineering and worked for five years with very talented software engineering researchers, of whom I would (in retrospect) hire exactly none to do any software engineering, because none of them had (at the time) much experience developing real software for a living. I learned to practice my own research field after I got my PhD. One of the reasons I left the academic world was exactly this: I knew I had never worked with any proper software engineers and that I wasn't one yet.

The real world is not science driven but experience driven. When it comes to the science of software engineering, it falls into two worlds: the academic tower variety, which is full of people employing scientific techniques such as formal proofs, and the empirical variety, which involves things like case studies, statistical evaluation, metrics, etc. Software engineering is all about solving problems in the real world, and if you lack empirical skills you are likely to come up with the wrong solution to the wrong problem without even realizing it: because you misunderstand the problem, because you lack the skills to evaluate the success of your solutions, and because your solutions probably do more damage than good.

A good software engineer knows his technology and knows how to depend on empirical evaluation rather than opinion to make decisions. The latter is a skill that needs to be taught in university, because currently it is all about the science and not about empirical evaluation. TDD/BDD are empirical practices, and Scrum/agile are all about empiricism. We need people who understand how these practices came into existence and why they work.

Stephane Vaucher replied on Thu, 2010/10/07 - 3:48pm in response to: Jilles Van Gurp

I agree that the software engineering curriculum should at least provide training in how to scientifically evaluate a practice/tool. IMHO, in an undergrad program, there are two major problems:
  1. Most of the time is spent training someone to be a relatively productive future worker. There is little time left to train someone to scientifically evaluate a practice/process/tool.
  2. It is hard for a student to abstract away the act of writing code and correctly evaluate which practices are good or bad. That often requires someone with some relevant work experience.

Jared Richardson replied on Fri, 2010/10/08 - 9:26am

Great article!

Emma Watson replied on Fri, 2012/03/30 - 5:11am

One of the frustrations for me as a long-time tech guy who is fairly new to software development is that I realize more and more how time-consuming even small development projects can be to create and later maintain. In a business environment, with close user interaction, I’d say there are lots of opportunities for tiny “one-off” projects to spring up, and these are just the sorts of applications which can later consume a lot of time as they are likely to be built without robust testing, monitoring, and configurability in mind.

