The study of what can be computed, and at what cost.
An algorithm is a finite sequence of unambiguous steps that transforms input into output. Sorting, searching, routing — every computable task reduces to choosing an algorithm and proving it correct. The same problem can have a clumsy solution and an elegant one; part of the discipline is telling them apart.
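The clumsy-versus-elegant contrast can be made concrete with searching. Both functions below solve the same problem; the first checks every element, the second exploits sorted order to halve the search space each step. A minimal sketch, not tied to any particular library:

```python
def linear_search(items, target):
    """Clumsy but correct: examine every element in turn."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):
    """Elegant: items must be sorted; halve the range each step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

On a million sorted elements, the linear scan may take a million steps; the binary search never needs more than about twenty.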
A computer is a tower of abstractions: transistors implement gates, gates implement instructions, instructions implement functions, functions implement systems. Each layer hides the one beneath so the layer above can be reasoned about on its own terms. Composition is the lingua franca.
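One illustrative way to see the tower is to build it: NAND is a universal gate, every other Boolean function composes from it, and arithmetic composes from those. The functions below are a toy sketch of the bottom two layers, using invented names:

```python
def nand(a, b):
    # The universal gate: all other Boolean functions reduce to it.
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    # The classic four-NAND construction of exclusive-or.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    # One layer up: arithmetic built from logic.
    return xor_(a, b), and_(a, b)  # (sum bit, carry bit)
```

Nothing above the `half_adder` line needs to know that NAND is underneath; that is the hiding each layer provides.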
Correctness is not enough — an algorithm must also be efficient. Complexity theory measures how running time and memory grow with input size. Constant, logarithmic, linear, quadratic, exponential: these asymptotic classes decide whether a program finishes in a second, a century, or never.
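The gulf between these classes is easy to underestimate. A small sketch (the function name and the one-billion-operations-per-second figure are illustrative assumptions) tabulates how many steps each growth rate demands:

```python
import math

def steps(n):
    """Rough operation counts for common growth rates at input size n."""
    return {
        "log n":   math.log2(n),
        "n":       n,
        "n log n": n * math.log2(n),
        "n^2":     n ** 2,
        "2^n":     2 ** n if n <= 64 else float("inf"),  # overflows reality fast
    }

# Assuming a machine that performs a billion operations per second:
for n in (10, 1_000, 1_000_000):
    print(n, {name: f"{count / 1e9:.3g} s" for name, count in steps(n).items()})
```

At n of a million, the linear algorithm finishes in about a millisecond, the quadratic one takes minutes, and the exponential one will not finish within the lifetime of the universe.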
Cities and roads, web pages and links, functions and calls — almost everything structured is a graph. Searching a graph breadth-first visits vertices in layers of increasing distance from a source; depth-first explores as deep as it can before backtracking. These two traversals underlie routing, compilers, and social discovery alike.
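Both traversals fit in a few lines over an adjacency-list graph. A minimal sketch, with the graph represented as a dict from vertex to neighbor list:

```python
from collections import deque

def bfs(graph, source):
    """Breadth-first: visit vertices in layers of increasing distance."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist  # shortest hop-count from source to each reachable vertex

def dfs(graph, source, seen=None):
    """Depth-first: go as deep as possible, then backtrack."""
    if seen is None:
        seen = []
    seen.append(source)
    for v in graph[source]:
        if v not in seen:
            dfs(graph, v, seen)
    return seen  # vertices in the order first visited
```

The only structural difference is the frontier: BFS uses a queue, DFS a stack (here, the call stack); that single choice produces layers in one case and deep paths in the other.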
Long before electronic computers existed, Turing defined a mathematical machine — a tape, a head, a table of rules — and proved that some problems cannot be solved by any algorithm whatsoever. Computer science begins at this boundary: not merely what runs fast, but what can be computed in principle at all.
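The machine itself is small enough to simulate. Below is a toy interpreter for a tape, a head, and a table of rules (the rule format and the bit-flipping example machine are this sketch's own assumptions, not a standard encoding):

```python
def run_tm(rules, tape, state="start", pos=0, max_steps=10_000):
    """Simulate a Turing machine.

    rules maps (state, symbol) -> (write_symbol, move 'L'/'R', next_state).
    The machine halts in state 'halt', when no rule applies, or after
    max_steps — the step cap is exactly the dodge the halting problem
    says we cannot avoid in general.
    """
    cells = dict(enumerate(tape))  # sparse tape; '_' is the blank symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        if (state, symbol) not in rules:
            break
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# An example machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
```

Undecidability bites in the `max_steps` parameter: no general procedure can tell, for an arbitrary rule table and tape, whether dropping that cap would let the machine halt.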
Pick a field and start reading. Algorithms, proofs, and references included.
Explore computer science