From meetup groups to job listings to titles on LinkedIn, full-stackers are all around. But for a term so prevalent, it is surprisingly ambiguous. What is a “full-stack” developer? What does she do? What is the scope of her work? Does our company need one? Practical questions that I imagine have crossed the minds of many VPs of R&D and CTOs. It would have been nice to have a common definition, but human language is dynamic and evolves, so fragmentation and ambiguity are part of the game. Mostly it works, and we can rely on our intuitive understanding of the terms used. However, in some situations it’s worth choosing a more accurate term, or even using a full sentence to describe what you mean, because ambiguity might be costly. For example, when writing a resume or a recruitment ad, you might want to be explicit about what it is you are looking for. Sadly, this is exactly where we see the ambiguous term “full stack” used the most. I tend to think of resumes and job listings as marketing material, so I expect them to be hip and ambiguous, but this does present a problem: how do I know if “full stack” is right for me?
If we want to understand what people are trying to communicate with “full stack” - the problem they are trying to solve - it helps to look at the process and context behind its use. In other words, the “how” and “why” of how “full stack” came to be.
Let’s look at some data and try to construct a narrative that makes sense. First question: is it “fullstack”, “full-stack” or “full stack”?
And is it more commonly referred to as “developer” or “engineer”?
The graph also clearly shows that “full stack” came into popular use around the second half of 2013 and exploded from there. I went hunting for early mentions of “full stack” and got as far back as 2005, but back then it seems to have been used to refer to things like a “full-stack Java framework” - obviously very different from the current use. The first reference I found for “full stack” as a job title or a type of developer is this blog post from June 2008. But we’ll get back to that later, because we have a more interesting question to answer: what happened in 2013?
I’ll give you a hint:
In the second half of 2013, mobile browsers gained more market share than Internet Explorer, and Browser War II was effectively won. So what happened? Frontend commoditization happened.
You see that region where ActionScript becomes less popular than Bootstrap and Angular? Right there. That’s when frontend started to be commoditized and “full stack” became a thing. Angular and Bootstrap made JavaScript/CSS-based web development easier than ever, along with package managers like NPM and Bower that launched around the same time (the NPM registry opened in January 2014; Bower was released in September 2012). Around the same time, something else happened on the bottom half of the stack:
Docker was released in March 2013 and was a driving force in the commoditization of backend development. These are but a few examples of the myriad events that happened around that time and made web development easier - CDNs became a commodity, for example, and browser standardization started to happen. But they all pale in comparison with the cloud. Until 2013 the cloud was a niche market used mainly by tech companies, and people still believed in just “building a better datacenter”:
By the end of 2013 it started to become clear that the cloud was the future, and AWS started investing heavily in expanding into the general market. For example, in November 2012 AWS hosted its first customer event, and in April 2013 it launched its “Global Certification Program”. At the same time, Google Compute Engine went GA and Satya Nadella was named CEO of Microsoft, suddenly making Azure an actual player in the cloud game. From that year onwards, everyone started adopting the cloud, and rapid commoditization of SaaS followed.
The origin of “full stack developer”
Let’s back up a bit to 2008:
Why would anyone need a “full stack” developer when the stack isn’t fragmented? Naturally, then, “full stack” was a reaction to the newly created divide between “backend developers” - who until then were just “developers” - and the new order of “frontend developers”. This divide started happening around 2008, when smartphones became a commodity and fragmentation in the browser market made anything to do with rendering a dark art. Rendering was still done on the server back then, but you needed experts on that particular problem. But we made progress, and by 2013 both sides of the stack were commoditized to the point that many people could be productive working on both.

It’s interesting to point out how all three graphs keep going up after 2013. From 2010 to 2018 the number of developers almost doubled, according to US census data and several other surveys. Have a look at the Stack Overflow developer survey: 20.5% of developers have less than 5 years of experience, 51.5% less than 9. From the survey:
About 50% of respondents identify as full-stack developers, and about 17% consider themselves mobile developers. The median number of developer type identifications per respondent this year is 3, and the most common pairs are combinations of back-end, front-end, and full-stack developer.
Where did they all come from? They couldn’t have come from backend or frontend, because those were also growing like crazy; it’s pretty safe to assume that massive growth came from new people becoming developers. In 2013 the “Fullstack Academy” opened in New York and started pumping out “full stack” graduates of its 13-week “full stack” development course. Whether they rode the hype or created it is immaterial, but I’d say this event marks the point where “full stack” as a profession started to boom. In a sense, this is just a case of Jevons paradox - we commoditized web development, making it easier and cheaper to build generic web applications, and demand soared. We are slowly but surely making software accessible to more and more people, realizing a 40-year-old dream. This is a good thing, but not without its price; the commoditization and rapid growth were fueled by exponential growth in CPU power and RAM sizes. A chat application now consumes 300MB of RAM, but it can be developed in record time - because you don’t have to deal with the entire stack, on the backend or the frontend.

Slowly but surely the stack is being divided again, not between “backend” and “frontend” but between infrastructure and applications. Did you notice how cloud providers are also massively involved on the client side? Google literally owns the client-side engines now. Frontend was never Amazon’s strong suit, but they are working very hard to make up for it (expect some heavy munitions soon). Microsoft lost the browser wars but has recruited an army of JavaScript developers (as in, core developers).
When something becomes a commodity, it also becomes abstracted. The hordes of new developers never had time to learn the entire stack, because we wanted them to write business applications. Contrary to what people usually imagine, the majority of developers aren’t working for Amazon, Google, Apple, or other giant tech companies, and they aren’t working in startups either - they are working in banks, governments, large enterprises, and so on. Do you think a developer at Walmart writes high-scale code that serves ads globally, or a travel-expense management system for corporate employees that will never have to scale beyond one server? And these are the developers who aren’t even using the cloud yet.
“Full stack developer” in the pre-commoditization era
Before 2013 and the technologies that commoditized web development, there were a few mentions of “full stack” - but what did the term mean before “frontend” and “backend”? How did it evolve?
The first mention of “full stack” as a job or skill is a blog post from June 2008, and the definition given there was:
A full stack web developer is someone that does design, markup, styling, behavior, and programming
Which doesn’t say anything about frontend or backend - not surprising, as this divide had only just begun. The term “full stack” was relatively obscure back then, and the same argument was being made under other names; this piece from April 2008, for example, calls it a “T-shaped developer”. The next mention, in October 2010, is by Carlos Bueno (of Mature Optimization handbook fame), who references the original blog post from 2008:
A “full-stack programmer” is a generalist, someone who can create a non-trivial application by themselves. People who develop broad skills also tend to develop a good mental model of how different layers of a system behave. This turns out to be especially valuable for performance & optimization work. No one can know everything about everything, but you should be able to visualize what happens up and down the stack as an application does its thing. An application is shaped by the requirements of its data, and performance is shaped by how quickly hardware can throw data around.
Again, no reference to either backend or frontend; rather, it describes a pretty senior engineer who can analyze complex systemic problems. In August 2012 we see another mention, from someone who was told by a Facebook employee at OSCON that they were only hiring “full stack” developers (which at the time meant “very senior developer”):
Is it reasonable to expect mere mortals to have mastery over every facet of the development stack? Probably not, but Facebook can ask for it. I was told at OSCON by a Facebook employee that they only hire ‘Full Stack’ developers. Well, what does that mean?
To me, a Full Stack Developer is someone with familiarity in each layer, if not mastery in many and a genuine interest in all software technology.
And right there, as early as 2012, before “full stack” became popular, we hear the same basic criticism of the term. Actually, in June 2008 someone had already tweeted something in this spirit, though the context is unclear:
@dozzermon also "Full stack developer/engineer" I'm sure it has some explanation but it sounds a bit daft/poncy to me personally.
— Richard Pearson 🐝 (@catdevnull) June 16, 2008
After 2013, when “full stack” went viral and hordes of inexperienced developers joined the industry, it quickly became obvious that the old meaning, as articulated by Carlos Bueno, was stretched thin. We start seeing more and more criticism on this point, and by the end of 2014 Peter Yared suggested a rename was in order:
I’d wager that there are zero individuals with advanced-level knowledge in each of these areas that would be capable of single-handedly delivering this next generation kind of application. Just keeping up with the advancements and new programming interfaces in each category is almost a full-time job… Rest in peace, full stack developers. Welcome, full stack integrators, in addition to engineers with deep technical skills in particular areas. It’s a fascinating world of software out there and we need you more than ever.
Summary
“Full stack developer” came into popular use in mid-2013 and caught on like wildfire. It is closely linked to the cloud age and the commoditization of web technologies on both the front and back ends, and it can be viewed as a reaction to the growing fragmentation between “backend” and “frontend” that started around 2006-2008. Demographically, “full stack” is characterized by relatively junior developers using commodity technologies to create applications.
Clearly the term “full stack” has evolved and changed from its original meaning in 2008. So what is “full stack” today? I’d say one of the following:
- Proficiency in both backend and frontend development
- Generalist: UX, frontend, backend, product, data analysis, etc…
- A frontend developer with just-enough backend skills
- A backend developer with just-enough frontend skills
If you subscribe to #1, then the question is: what is your bar for “proficiency”? Commoditization suggests this bar is dropping very fast. As far as I’m concerned, under this definition there is only one true full stack engineer, and his name is John Carmack. I would describe the modern full stack developer as:
A person who can build a generic web application on his own
Note how “non-trivial” has been omitted. I think this description is much more in line with what we see in the industry.
The real question, however, isn’t what I or someone else thinks a “full stack developer” is, although that is somewhat related. The question is: given the ambiguity of the term, should we use it at all? Words have connotations, and I suspect “full stack” caught on in 2013 precisely because of those connotations. It implies mastery of the entire stack, and the contradiction between that implication and the realities of our industry is just confusing. I would suggest “web application developer” or “application developer”, which I think communicates the reality of the job more clearly.