Thoughts on entering software development after 20+ years
I am at a crossroads in my software career, which has made me a little introspective. I have been considering what I would do differently if I were to enter the software industry now, or what advice I would give someone just starting out.
I’ve been working in software since 1995. My first professional programming task was to fix Y2K bugs in an embedded system that hadn’t shipped yet. I graduated in 1997 at the height of the dot-com boom, when developers were being sucked up by the burgeoning industry at an unprecedented rate.
This was simultaneously a great time and a terrible time to enter the industry. While well-paying jobs (at least by the standards of the time) were readily available, it was possible to get away with less technical depth. Knowing a language and a few CS fundamentals was enough to write your ticket into the industry. For some, knowing HTML alone was enough. Leaner times would have required honing more advanced skills or pursuing advanced degrees.
Education
For many, myself included, the easiest way to enter the industry has been a CS degree, but now I would be hesitant to recommend that route. The best software developers I’ve worked with have come from Math, Engineering, or Physics backgrounds and are self-taught programmers (although I’ve also worked with strong developers with no degree at all). Consider the high-growth areas of the industry: data science and visualization, machine learning, virtual reality, and so on. They all require a strong mathematical background, and I don’t see this trend subsiding.
The barriers to learning programming have dropped significantly in recent years. When I entered the industry, commercial development tools like Borland C++ were the standard and were still relatively expensive for students. Today, unbelievably powerful computers are cheap and ubiquitous. High quality compilers, operating systems, databases, development tools, application frameworks, hosting environments, online tutorials, and documentation are now either free or inexpensive. If you have the aptitude and motivation, programming can be learned independently.
Regardless of how you learn programming, self-study is inevitable. Many of today’s popular languages and platforms didn’t exist when I was in college. I was taught a now virtually unknown programming language (Modula-2). This was controversial because there was no demand for the language in industry, but ultimately it didn’t matter: the popularity of languages changes frequently enough that it is irrelevant which language you learn first. What matters is learning the fundamentals, which have remained surprisingly constant.
Open Source
In 1995 I owned a 32-bit Pentium PC with 8 MB of RAM. The most common PC operating system was 16-bit Windows, and its segmented memory model felt archaic compared to the Sun workstations we had in the CS lab. Eventually someone introduced me to a free OS called Linux. I installed a Slackware distribution from the CD in the back of a “Running Linux” book I bought at Borders, and soon went down the rabbit hole of compiling kernels to get my hardware to work (amazingly, at the time the kernel didn’t have dynamically loadable modules).
For CS students, Linux was a boon. We now had the same tools at home that we had in the lab. But the idea of running software with freely available source code hadn’t yet entered the industry mindset. These were the days of shrink-wrapped, packaged software driven primarily by Microsoft, when enterprise software meant buying an Oracle database.
Fast forward 20-odd years, and now every major software company, including Microsoft, has a major investment in Linux and Open Source. Almost every major software development platform is Open Source. The rise of Open Source has, in part, led to the end of software as a product, which has coincided with the rise of services.
Open Source accelerated the build-out of the internet (Google probably wouldn’t exist without it) and made it far less expensive for startups to launch new applications. But it has also changed the value proposition of software: there is now an increased expectation that high quality software, especially infrastructure, will be freely available.
Competition
In the 90s, CS was still a fairly niche major, which, combined with significant demand, made it relatively easy to enter the industry. After the dot-com crash, CS enrollment fell further. But as technology has woven its way deeper into society, with the promise of a high-paying career, interest in programming and CS enrollment has exploded. So far this has coincided with demand as the industry continued to grow, but I think we will reach the point where the supply of developers outstrips demand. The last few years have brought an increased concentration of power to the major tech companies, and as the infrastructure required to compete with the major players continues to grow, I believe we will continue to see a few companies dictate the direction of the industry.
Top candidates (especially those with in-demand skills like AI) will still be able to land lucrative jobs with the major tech firms, but I believe employment and salaries will drop off quickly at the second tier. This hasn’t happened yet, as VC-fueled startups are still paying high salaries and driving demand, but if the economics of VC funding change, this could change quickly.
As recently as 20 years ago, computer literacy, or in some sense the mere ability to operate a computer, had intrinsic value in and of itself. In the early days of the internet, many of us were generalists who did networking, system administration, and software development. Today no one considers operating their smartphone a valuable job skill. System operations are becoming concentrated at major cloud providers, and it is possible to spin up a new website in minutes. This is good in that it allows programmers to focus on the application’s problem domain, but it also means the value of some operational knowledge has been lost, and the value software developers offer now lies in deeper skills.
Expectations
In the 90s, software quality was pretty bad. Windows crashed frequently. Data was corrupted. Security was often non-existent. Major projects failed to ever get off the ground. At the time it was considered a crisis, and an entire consulting industry popped up promising to address the problems with newfangled methodologies. Users expected software to fail.
Today, projects still fail and security is a major concern, but overall software has gotten amazingly good considering the amount of complexity in a modern application. Expectations of what software can do for users have increased significantly. App stores are overrun with applications at little or no cost. Creating real value for users requires increased sophistication. As just one example, it is no longer sufficient to build a photo sharing application: with the popularity of Snapchat and Instagram, users now also expect interactive filters, which require advanced programming techniques.
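To give a feel for what that involves, here is a minimal sketch (assuming the Pillow imaging library and a placeholder file name, photo.jpg) of a basic static sepia filter. Even this boils down to per-pixel arithmetic, and the interactive filters popularized by Snapchat layer real-time face tracking and GPU rendering on top of this kind of work.

    # Minimal sketch: a static sepia filter using the Pillow library.
    # "photo.jpg" is a placeholder; any RGB image will do.
    from PIL import Image

    def sepia(path):
        img = Image.open(path).convert("RGB")
        pixels = img.load()
        width, height = img.size
        for y in range(height):
            for x in range(width):
                r, g, b = pixels[x, y]
                # Standard sepia weights, clamped to the 0-255 channel range.
                pixels[x, y] = (
                    min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
                    min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
                    min(255, int(0.272 * r + 0.534 * g + 0.131 * b)),
                )
        return img

    if __name__ == "__main__":
        sepia("photo.jpg").save("photo_sepia.jpg")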
Take the big company job
When I graduated I had offers from multiple large companies and one startup. I took the job at the startup. This turned out to be a mistake. It was chaos, and the company soon failed. In retrospect, I think working for a large company, even for a few years, would have provided a more solid foundation for launching my career. Not only would I have been able to build my resume, I would have been able to learn the trade in a more stable, successful environment. I think it is fair to say that the big tech companies are successful because they are doing a lot of things right. I’m not advocating avoiding the small company experience (personally, I enjoy working for smaller companies), but if you are fortunate enough to have an offer from one of the big tech firms early in your career, I would take it. You can always work at a small company after you’ve gained more experience.
End of the wild west
With the Cambridge Analytica scandal making headlines, and increased regulation including Europe’s GDPR and the US’s FOSTA/SESTA, the “anything goes” days of the internet are coming to an end. I won’t make a political judgement as to whether this is good or bad, but having worked in a heavily regulated domain, I will say that regulation does not make things easier for small businesses. I do think regulation will favor larger companies, which have the clout, resources, and legal talent to navigate an increasingly regulated environment.
AI changes everything
AI is the big unknown in the future of the software industry. It simultaneously intrigues and scares me. I think it has the potential to be the biggest revolution in the industry since the internet. The progress made in the past few years is nothing short of amazing. What makes it scary from a practitioner’s point of view is that it is far enough outside the domain of traditional computer science that it feels like an entirely different field. AI research existed when I was in college, but at the time most people failed to take it seriously. While the industry’s effort moved to the network, a few souls stuck with AI research, and now it is paying off. I think the future of the industry hinges on what happens with AI, and it could make a lot of traditional applications obsolete.
Good luck
I don’t mean to dissuade anyone from joining the software industry. Working in software has brought me many opportunities for which I’m thankful. At the same time, if there is anything to expect from the industry, it is constant change and learning. I wrote this as a reminder to myself, with sweaty palms, that the world is a different place than it was 20 years ago, and it requires adaptation.