Here's an excellent piece we've been sent by recent ASNC graduate Shelby Switzer. It tells not only of the more unexpected places an ASNC degree can take you in life, but also (much to our delight) how that education travels with you as you go. After leaving ASNC in 2012, Shelby travelled and volunteered in Asia for six months before returning to the USA. She is now a self-taught programmer and works with a number of tech start-ups as a software developer and content writer. In March she will begin a new adventure teaching Ruby on Rails at the Iron Yard code academy in Atlanta.
When I tell people what I do, and then
answer the inevitable “What was your major in college?” I'm usually faced with
exclamations of surprise, bewilderment, or just plain confusion.
My answer to the latter question usually garners some semblance of that
response anyway, even though I always thought that Anglo-Saxon, Norse and Celtic was a
totally normal university course that anyone in their right mind would elect to
take.
But what most people can't seem to piece together is how I went from a degree
where I learned medieval Welsh, recited Latin and Irish poetry, and studied
Anglo-Saxon kings, to a career
that seems so deeply rooted in modern technological culture: programming.
Maybe some of the shock has to do with the
century gap – most of my time in college was spent poring over texts written a
thousand years ago, and now my daily life centers around languages invented in
the past few decades. I think most of the confusion, though, arises from this
prevailing concept that humanities degrees cannot lead to STEM
careers, which I think stems from an even more troubling idea that the primary purpose of a degree is to prepare you for a career. Both ideas are mistaken.
I'm not here to give my life story on how I
made the “transition” from whatever “normal” humanities-major folk do to what
“techies” do – whatever any of that even means – but to reflect on my
experience and share how my extremely esoteric, impractical, fantastically
interesting, unique, and fun humanities degree did in fact give me skills that
I use on a daily basis. Skills for which I am infinitely grateful, and which
are needed in my field.
1. Communication
I can't even begin to stress how important
this is. My communication skills improved dramatically when I had to write
three- to five-thousand-word essays every week and discuss
them verbally in supervisions. Now, whether I'm pitching crazy awesome apps
to a potential investor or, more regularly, explaining to clients why a certain
feature addition just isn't a practical use of my time or their money, I have
to be able to communicate well. Contract programming is half coding, half
negotiation.
But on an even more basic level, communication
is key to making good software. How many times have you used a gem
or other program with not only crappy (or nonexistent) documentation, but
obtuse methods that don't elucidate what on earth the code is trying to
accomplish? When have you inherited a piece of software to hack on, only to
find a similar situation, as well as a tight deadline that leaves little time
for figuring out what the previous programmer intended? Computer languages are
great and all, but they're read by humans as well as machines, and code needs
to communicate well.
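To make that concrete, here's a contrived Ruby sketch of my own (the method and variable names are invented for illustration). Both methods do exactly the same thing; only the second tells the next human what it's for.

# Obtuse: the name and variable communicate nothing about intent.
def f(l)
  l.select { |e| (e % 15).zero? }
end

# Communicative: identical behavior, readable without external docs.
# Keeps only the numbers divisible by both 3 and 5.
def multiples_of_fifteen(numbers)
  numbers.select { |n| (n % 15).zero? }
end

puts multiples_of_fifteen([5, 15, 30, 40]).inspect   # prints [15, 30]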
2. The Power of Language
This brings me to my little rant on how
awesome language is, and how central it is to programming. I can just see you
busting out the no-crap face now – “Well, they're called computer languages,
dummy!” – but hear me out. My intense study of multiple dead and living
languages instilled in me an intuitive grasp of syntax, grammar, and general
linguistic structure across incredibly different language families. When I first
saw = and == in a Ruby program, I could immediately pick up on which contexts
they were frequently used in (e.g. when one was declaring the value of
variables and when the other was being used in conditional statements).
I never even read the documentation or had a real tutorial before I began
taking = and == and using them (reasonably) correctly. When I first look at a
piece of code, my mind starts recognizing, memorizing, and using patterns
like this, so new computer language acquisition is a rather speedy (and
thorough) process for me.
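Spelled out in a few lines of Ruby (a trivial sketch of my own):

greeting = "hello"            # = declares the value of a variable

if greeting == "hello"        # == compares values in a conditional statement
  puts "Someone is being friendly."
end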
But natural attention to linguistic
structure isn't all that my humanities degree imparted to me; it also gave me
a sheer joy in linguistic diversity and nuance. When I learned
that a new array can be created by either array = [] or array = Array.new, I
was freaking stoked.
The first is simple and quick, while the second allows for arguments to be
passed into it, like Array.new(2, "baller") (which yields ["baller",
"baller"]). Which one you choose to use entirely depends on what you feel like,
or what your situation calls for – akin to how in Irish both cú and madra
mean “dog,” but both have different connotations and would be used based on
personal choice or context.
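As runnable Ruby, the two spellings look like this:

array = []                    # simple and quick
array = Array.new             # the longer road to the same empty array

Array.new(2, "baller")        # => ["baller", "baller"]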
3. Informed Decisions
Simple as this example is,
choosing between [] and Array.new can only be done well through understanding
the range in meaning and usage of each one. These can be learned quickly if
you're already trained in what to look for, and especially if you're already
used to the amazing flexibility, dynamism, and nuance of human languages. But
my humanities degree also trained me for making good decisions on a larger
scale.
The programming community is constantly
discussing what are called “best practices.”
The medievalist community is constantly discussing whether Arthur originated in
Wales, France, or Mars. I know the parallel should be obvious, but in case it's
not, let me explain. When I'm writing a paper on the origins of Arthur and I
argue that there are Byzantine references in some texts that suggest the legend
of Arthur started on the Continent and not in the British Isles, I really have
to do my research. I have to cite scholars who agree and scholars who disagree
– and assess these scholars' credibility. I have to determine if the references
were introduced at the same time the text was originally written, or if they
were introduced later. I have to look at these examples of Byzantine references
myself and determine if they are strong enough references, or even if they're
referring to Byzantine culture at all.
The same goes for when I'm architecting a
piece of software. If I hear about some cool new programming trend, I have
to do just as much research. Does this trend fit with “best practices”? Are
“best practices” – which change frequently, mind you – really the best, whether
overall or just within my current project? Who's promoting the trend, who's
dissing it, and do I respect those individuals' work?
If I'm considering using a gem in my
application, I need to read the gem's code to see if it was even done well
before I blindly just plug it into my program. I want to see who made it, who
uses it (if possible), if it's being maintained, and if it can really fit
within the scope of my project. It's so irritating to start using a gem without
doing enough research, only to end up ditching it (and having to clean up after
it) because it wasn't suitable or was poorly crafted.
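When a gem does pass muster, I like to record the result of that vetting where the next developer will see it. A hypothetical Gemfile entry (the gem name and version are made up for illustration):

source "https://rubygems.org"

# Vetted: actively maintained, readable source, and it does exactly what
# our pagination feature needs, nothing more. Pin to the version reviewed.
gem "some_pagination_gem", "~> 1.4"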
Long hours of research, meticulous
citation, and argumentative writing taught me how to immediately approach
making decisions based on critical thinking and strong research and evaluation,
as well as the ability to change my decisions in light of new evidence or
compelling arguments. These skills are essential both when coding and when
designing software. They keep you from just hopping on the latest code-wagon and help you
argue your case when talking to your team about the decisions you're making –
which also goes back to the importance of communication.
4. Making the Pieces Fit Together
One of the things I only realized recently
about myself as a programmer, and really as a person, is that I'm good at
keeping the big picture in mind. You could attribute this to my personality type,
my zodiac sign, or
whatever, but I think a lot of it has to do with my humanities degree.
Why? Because of having to write 15,000-word essays that are at least somewhat
coherent! All those paragraphs and arguments you make and block quotes you use
have to be tied back into your original thesis: everything must be relevant. So
when I'm writing an application, I'm very aware of all my models, objects,
controllers, views, partials, etc, and constantly thinking about how they piece
together and work towards the goal of a specific feature or of the whole app. I'm not
saying I don't forget things, but I do find myself frequently asking teammates
how something is going to play with an object or feature they have forgotten
about. I believe I learned a lot of this behavior from writing extensive
essays, pulling together every corner of knowledge I have to fit into
arguments, and trying to keep my structure coherent, cohesive, and concise
– all three alliterative adjectives which I think apply to good programming.
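In Rails terms, that coherence might look like the minimal, hypothetical sketch below (the names are invented, and it assumes a standard Rails app): every piece exists to serve one feature, the way every block quote serves the thesis.

# Model: knows what an unread book is.
class Book < ActiveRecord::Base
  scope :unread, -> { where(read: false) }
end

# Controller: wires the model to the view for the "reading list" feature.
class ReadingListsController < ApplicationController
  def show
    @books = Book.unread
  end
end

# The view (app/views/reading_lists/show.html.erb) then renders @books,
# perhaps through a _book partial shared with other pages, so every
# piece ties back to the one feature it serves.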
Writing essays,
conducting research, learning other languages, having to defend and communicate
a position verbally and in writing, are all key components of humanities
curricula that can help make better coders, technologists, careerists, people,
dogs, anacondas — you name it. I'm not saying that I'm a perfect programmer, or even a great one,
but I do think that my humanities degree prepared me pretty darned well for
this life of code I've stumbled upon.
Thanks, Shelby.