
Computer Science, Briefly

Before we dive head-first into Python and the fundamentals of Computer Science, it's important that you understand what "Computer Science" actually entails. You probably already have your own definition of CS -- and that definition is probably why you're here -- but I'd like to take a moment to capture the beautiful breadth of the field you'll be getting acquainted with over the next few weeks.

Definitions Galore

Merriam-Webster defines computer science as "a branch of science that deals with the theory of computation or the design of computers." Wikipedia takes this a little further, saying that computer science "is the study of processes that interact with data and that can be represented as data in the form of programs." So which one is it?

Professor DeNero's Spring 2020 slides define computer science as the study of:

  • what problems can be solved using computation,
  • how to solve those problems, and
  • what techniques lead to effective solutions.

I don't have a definition of my own yet -- I'm a college freshman who has only just started studying the subject -- so I'll abide by Professor DeNero's definition, which seems to be the most descriptive and specific of the three.
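
To make that definition a bit more concrete, here's a minimal sketch in Python (the language we're about to dive into) of one problem solved two ways. The function names are mine, purely for illustration; the point is that the problem can be solved using computation, and that the choice of technique decides how effective the solution is.

    # One problem -- "what is the nth Fibonacci number?" -- solved two ways.

    def fib_slow(n):
        """Straightforward recursion: correct, but it recomputes the same
        subproblems over and over, so it takes exponential time."""
        if n < 2:
            return n
        return fib_slow(n - 1) + fib_slow(n - 2)

    def fib_fast(n):
        """The same problem with a more effective technique: iterate once,
        carrying the last two values along (linear time)."""
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    print(fib_slow(10), fib_fast(10))  # both print 55; try n = 35 to feel the difference

Both functions solve the problem, so it's computable; only the second reflects a technique that leads to an effective solution.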

Diversity & Perpetuity

There is no one thing that computer scientists do. Every computer scientist works in a different realm of the field: theory, scientific computing, networking, security, graphics, systems, and, most recently, artificial intelligence. Especially with that last one, it's important to note that computer science is a continuously evolving field. More and more things become part of computer science, while some older things eventually become obsolete and die out.

The beauty of computer science is that it isn't going away anytime soon. There's so much talk about revolutionary ideas floating around all the time that you might think we're bound to hit a peak. We're not. Artificial intelligence only recently gained traction, and the global scale of machine learning means that it's going to stick around for a looooong time. New machine learning models are built every single day, and yet they're never perfect. New techniques are discovered, existing ones are combined, and feedback is folded back into the next round of models -- a continuous cycle of improvement.

Aside from AI and ML, we also see constant growth in things like operating systems. The first version of macOS came out in March 2001, three days after my birthday. The first version of Windows came out more than 15 years before that. It's been 18 years since macOS was released and 34 since Windows, and yet both operating systems still push out system updates on a near-daily basis. And they're far from perfect.

On the brighter side, if you pursue computer science, this means a lot of job security :)