whoami

I'm an undergraduate at The Cooper Union, nominally majoring in Electrical Engineering, though the "major" part feels less like a fenced yard and more like the particular set of tools I've decided to keep closest to hand. I came to EE the way people sometimes come to a new city: not because the old place was bad, exactly, but because the street map here seems to contain more of the roads I keep trying to drive on: computation, signals, systems, the whole strange business of turning the world into representations and then doing something useful (or at least illuminating) with those representations.

One durable aspect of me is the desire to understand what's going on, and then to apply that understanding in a way that touches reality, ideally without breaking too many things in the process. Which is why I keep circling intelligence and simulation: intelligence as the question of how behavior can come to look like "knowing," and simulation as the disciplined daydream where you can build a world, perturb it, measure it, and learn what survives contact with constraints.[1] Lately I've been learning more about computation: what it costs, what it buys, what it can't buy, and how it seems to rhyme with intelligence in ways that start to feel less like metaphor and more like physics.[2]

I also think EE is going to be the central terrain, no matter what the headline says the future is about, because everything ends up as energy, information, and hardware somewhere. That belief is part of why I switched from mechanical to electrical: the questions I wanted to ask kept sounding like circuits, not gears.

And, if I'm being honest in the slightly embarrassing way a "whoami" demands, my purpose is partly to take advantage, gratefully, of being alive at a time when so many human feats are already built and so much knowledge is just... there, waiting, accumulated; and to balance all that engineered brilliance against the other kind of brilliance, the natural kind, the kind that doesn't care whether you can derive it, only whether you can notice it. And yes, I did ask ChatGPT to write this like David Foster Wallace (which is either an admission of outsourcing, or an attempt at honesty-by-preemptive-confession, or both, depending on how charitable you're feeling).

[1] Also: simulation is where you can fail cheaply and frequently, which is a very underrated educational privilege.

[2] Information theory keeps showing up like an uninvited guest who is nevertheless always right about the seating arrangement.