First Light

Signal & Memory
By Daniel J. Dodson
“Near Abilene and Amarillo, where wind turbines stretch across the plains like rows of white tuning forks, Alphabet/Google has built new facilities designed to inhale vast amounts of renewable power directly from the generation source. No transmission losses. No detours. The servers eat wind for breakfast. Beside them rise the most quietly revolutionary power systems of the decade: gravity batteries. Not lithium. Not molten salt. Not vats of chemical slurries. Just cranes, pulleys, motors, and gigantic concrete blocks the size of small garages, an engineering feat that would have impressed Archimedes and baffled Edison.”—Daniel J. Dodson

I. Tending the Light
AUSTIN, Texas—(Hubris)—January/February 2026—Before the photons arrive from galaxies we’ll never visit—before telescopes catch them, digitize them, and send their faint, ancient messages cascading into petabyte archives—there is a quieter sort of first light. Not the kind that crosses a cosmic void, but the kind that spills out the front window of a neighborhood bookstore, warming a sidewalk still half-haunted by memory.
The building used to be a US Post Office, back when stamps were small acts of faith rather than historical curiosities. In 2022, the mailboxes and brass slots gave way to shelves, and the old sorting room and worn floors became the First Light Book Shop—tucked in just a few blocks from one of Austin’s moonlight towers, those improbable 19th-century iron giants that were supposed to banish darkness and instead became local landmarks of stubborn optimism. On certain evenings, the tower here still hums to life, throwing its broad, deliberate arc of illumination across a neighborhood that has reinvented itself a dozen times without quite forgetting what it used to be.
It’s here, in this little district of light and recollection, that librarians and booksellers tend to knowledge the way lamplighters once tended flame—trimming the wick, adjusting the mantle, keeping the glow steady. They don’t say it aloud, but every well-placed book is its own lantern. Every reader who wanders in becomes, briefly, part of the maintenance of civilization. There are grander projects in the world, yes, but few as reliable.

Two Kinds of Light
Meanwhile, thousands of miles away and at a scale that almost laughs at human comparison, two instruments—the Vera C. Rubin Observatory, which achieved its first light in June 2025, and the Nancy Grace Roman Space Telescope, scheduled for launch in May 2027—prepare to receive a very different kind of illumination. Rubin is already scanning the sky, collecting photons that have been traveling so long they predate nearly every story humans have ever told. Soon, Roman will join it, sweeping the infrared in wide fields, assembling a map in wavelengths no eye can see.
Together, these instruments will generate more data in a decade than all astronomers combined have gathered in human history. At which point we will discover—somewhat sheepishly—that collecting the universe was the easy part. Understanding it is harder.
These telescopes will not merely observe the cosmos. They will inherit it. With inheritance comes responsibility: to store, interpret, share, and steward a volume of information that outstrips the old metaphors of “archives” or “catalogues” entirely.
For now, though, it is enough to stand on the sidewalk outside a small bookstore in Austin, watching the warm spill of light on the pavement, and recognize that all first lights—cosmic, civic, or human—begin the same way: someone, somewhere, tending a flame.

II. The Telescopes’ First Light
Rubin’s First Light
The warm glow of the bookstore fades as the scale expands—streetlight to moonlight tower, moonlight tower to sky, sky to the southern hemisphere, where the Chilean Andes rise in layers of stone and snow like the ribs of the Earth. It is here, at 8,900 feet above sea level, that the Vera C. Rubin Observatory captured its first light in June 2025. Its 8.4-meter mirror—so perfectly curved it seems almost embarrassed by its own precision—opened its eye to a universe that had been knocking politely for 13 billion years.
Rubin didn’t just take a picture. It began a long-planned, meticulously choreographed performance: scanning the entire visible sky every few nights, again and again, like a cosmic night watchman on patrol. Each night of observing produces roughly 20 terabytes of raw data—on the order of 6 petabytes a year—numbers that seem manageable right up to the moment they aren’t. Rubin will do this for ten years. It does not get bored. It does not need weekends. It also does not require committee meetings, which gives it a strategic advantage over every human institution.
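For readers who prefer their astonishment checked in arithmetic, the data rate works out on the back of an envelope. A minimal sketch, assuming a 300-night observing year and a flat 20 terabytes per night (round figures, not observatory specifications):

```python
# Back-of-the-envelope check on Rubin's data volume.
# Assumptions (illustrative, not official specs): ~300 observing
# nights per year, a flat 20 TB of raw data per night.
TB_PER_NIGHT = 20
NIGHTS_PER_YEAR = 300
SURVEY_YEARS = 10

pb_per_year = TB_PER_NIGHT * NIGHTS_PER_YEAR / 1000  # terabytes -> petabytes
pb_total = pb_per_year * SURVEY_YEARS

print(f"{pb_per_year:.0f} PB/year, {pb_total:.0f} PB over the survey")
# -> 6 PB/year, 60 PB over the survey
```

Six petabytes a year, sixty over the survey: the same order of magnitude as the decade total discussed below, which is reassuring, or alarming, depending on whether you are the one buying the disks.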
Nearly 100 million objects—stars, galaxies, asteroids, and wandering specks not yet dignified with a category—will appear and reappear in its images. Some will flicker. Some will drift. Some will lens, brighten, or behave in ways that force astrophysicists to sigh deeply while rewriting grant proposals. This is Rubin’s purpose: to catch the transient, the subtle, the fleeting, and the things the universe only does when it thinks we’re not looking.

Roman’s Wide Eye
Orbiting high above, the Nancy Grace Roman Space Telescope waits for its turn. Scheduled for launch in May 2027, Roman will not match Rubin’s tempo, but it will exceed nearly everything else in reach. Its wide-field infrared vision will map galaxies, detect exoplanets, and conduct the kind of cosmological census that once required multiple observatories, diplomatic coordination, and the patience of saints. Roman is sometimes called “Hubble’s wide-eyed successor,” a description that undersells it: Hubble’s deep field is the size of a grain of sand held at arm’s length; Roman will capture a field one hundred times larger in a single exposure. If Hubble made humanity look up in awe, Roman may be the one that makes us quietly sit down.

The 60-Petabyte Problem
Individually, these observatories are ambitious. Together, they are unreasonable. Across their first decade, Rubin and Roman will produce roughly 60 petabytes of raw data (over 500 petabytes archived once calibration products and metadata are included)—more than all astronomical observations ever recorded, generated in the span of a single adolescence. The comparison is not poetic. It is painfully literal.
Which leads to the question astronomers gracefully ignored until it could ignore them back: what exactly does one do with 60 petabytes?
You can’t compress it. Compression works for text because language repeats. It works for video because frames rhyme with one another. It works for email because half of every email is an enthusiastic signature pretending to be essential. But astronomical data is designed to avoid redundancy—a billion individually valuable photons arriving from a billion unrelated sources. Any pixel could contain a supernova precursor, a new exoplanet, or the faint distortion of dark matter bending the rules. You cannot summarize the sky safely.
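The point about redundancy is easy to demonstrate. A small sketch using Python’s standard zlib: hand a general-purpose compressor something repetitive and something noise-like (random bytes here stand in, loosely, for noise-limited sky pixels) and the asymmetry is immediate:

```python
import os
import zlib

# Why redundancy matters to compression: repetitive text shrinks
# dramatically, while high-entropy bytes (a stand-in for
# noise-limited sky pixels) barely compress at all.
text = b"the photon arrived. " * 1000   # 20,000 highly redundant bytes
noise = os.urandom(20_000)              # 20,000 bytes with no structure

print(len(zlib.compress(text)))    # a tiny fraction of 20,000
print(len(zlib.compress(noise)))   # roughly 20,000, sometimes slightly more
```

The compressor is not being lazy on the second input; there is simply nothing there to exploit. That is what “designed to avoid redundancy” costs at archive scale.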
A wry corollary follows: the universe refuses to cooperate with our storage budgets.
So, the raw images must be kept. The calibration frames must be kept. The transient catalogs must be kept. Even the boring nights must be kept, because the universe has a habit of hiding miracles in the margins of reasonable evenings.

The Successful Catastrophe
Which brings us to the next layer of irony: astronomers succeeded too well.
For decades, they fought uphill battles for funding, design approvals, international agreements, prototype mirrors, audits, rewrites, and procurement delays—the whole bureaucratic obstacle course of modern science. And now, having achieved their dream, they face a new adversary: their own success.
Rubin and Roman will not be limited by optics, orbit, or the ability to gather photons. They will be limited by our ability to keep up.
A senior scientist described the situation—only half joking—as “like drinking from a fire hose pointed at a hummingbird feeder.” Another compared it to “trying to read three million novels being written simultaneously in Sanskrit.” One researcher suggested that the real solution might be teaching the telescopes to blink. Rubin, characteristically, ignored the suggestion.
The metaphors differ, but the message is clear: we built machines capable of capturing the universe faster than the universe can comfortably be understood.
The bottleneck has shifted. For centuries, the challenge was collecting enough light. Now, the challenge is deciding what to do with all the light we’ve collected. The problem is not telescopes. The problem is architecture—computational, logistical, and physical.
Because the real question, the one that lingers patiently at the edge of every discussion about Rubin and Roman, is this: where does all this data go, and how does anyone make sense of it?
Not into a stack of hard drives. Not into a cloud bill shaped like a national budget. And not onto a graduate student’s laptop—unless the plan is to store roughly the first three minutes of the first night of the first survey and then call the dissertation complete.
This is not a storage problem at all. It is an infrastructure problem—one that demands an entirely new architecture, capable of receiving data at the speed of light and thinking almost as quickly.

III. Exascale Computing and the Quiet Industrial Revolution
The telescopes gather the light. The astronomers need revolutionary machines that can make sense of it rapidly enough for us to learn something new.
If the telescopes are the eyes of this new era of astronomy, the exascale computing systems waiting downstream are the lungs—vast mechanical diaphragms expanding and contracting to metabolize every photon Rubin and Roman send their way. And the truth, whispered with increasing urgency in observatory control rooms, is that humanity can no longer merely store the universe. We must interpret it in real time, before the cosmic events politely excuse themselves and vanish.
Rubin alone generates 20 terabytes a night. Roman will add torrents of infrared detail. Together, their output arrives not as a gentle stream but as a continuous, thundering flow—an astrophysical joke with a punchline that always lands on data engineers: you cannot save the universe for later. The sky moves. Stars flare. Asteroids wander. Gravitational lenses bloom like cosmic bruises and fade again. If you wait too long to process the data, the event you’re looking for has already left the building, signed the guestbook, and is halfway to Andromeda.
This is why processing must happen as the photons arrive—not months later, when research groups have grant deadlines and excellent intentions. Machine learning systems now scan each frame for transients, track objects across the sky, mark potential exoplanet dips, and flag anything that appears to violate the polite laws of physics. These models require exascale performance: computing systems capable of 10^18 operations per second, roughly one operation for every grain of sand in a medium-sized riverbank, performed every second of every night for a decade.
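What exascale buys you against this ingest rate can be put in one rough, illustrative budget. Assuming a ten-hour observing night and a flat 20-terabyte data rate (both simplifications, not pipeline specifications):

```python
# Rough, illustrative throughput budget: how many operations an
# exascale machine could spend on each incoming byte of a 20 TB
# observing night. The 10-hour night is an assumption.
OPS_PER_SECOND = 1e18       # exascale: 10**18 operations per second
BYTES_PER_NIGHT = 20e12     # ~20 TB of raw data
NIGHT_SECONDS = 10 * 3600   # a ~10-hour observing night

ingest_rate = BYTES_PER_NIGHT / NIGHT_SECONDS   # bytes per second
ops_per_byte = OPS_PER_SECOND / ingest_rate

print(f"{ingest_rate / 1e9:.2f} GB/s in, {ops_per_byte:.1e} ops per byte")
```

About half a gigabyte per second in, and on the order of a billion operations available per byte: plenty for ingest alone, which is precisely why the budget gets spent on the expensive parts, namely the model inference, cross-matching, and reprocessing that turn pixels into alerts.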
And here lies one of the quiet ironies of our moment: for the first time in human history, astrophysics is limited not by astronomy but by computing throughput. Rubin and Roman image the cosmos flawlessly. It is the rest of us who must catch up.

The End of the Electronic-Only Era
Electrons, for all their usefulness, generate heat when pushed, scatter when rushed, get confused, and wander into places they’re not wanted—something like interns during their first week on the job. Engineers have shrunk transistors to near-molecular scales, stacked them, strained them, doped them, cooled them, and occasionally whispered encouraging words to them, and yet the limits are so tight that any further gains feel like acts of divine intervention.
When electrons are asked to carry the massive volumes of data produced by the Rubin and Roman telescopes—across a chip, across a board, or across a cluster—the physics turns against them. The more data they carry, the more heat they generate, and the harder the system must work to keep pace and keep its cool.
Consider the mismatch: it’s like asking a reader in Peoria to locate one hundred unspecified books at the modern Library of Alexandria and summarize them in 10,000 words—in the next five minutes.
Photons, by contrast, do not share these limitations. Light crosses a chip with almost no resistance and little heat. It glides through the architecture like a guest who knows exactly where the silverware drawer is. Even more important, light can encode information in several independent ways—not just on/off binary states, but wavelength, phase, amplitude, and polarization. One photonic channel can behave like a small orchestra of distinguishable states.
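The orchestra metaphor has simple arithmetic behind it: independent degrees of freedom multiply. With purely illustrative counts (not any real modulation standard), one optical channel carrying several wavelengths, phase levels, and polarizations encodes many bits per symbol:

```python
import math

# How independent optical degrees of freedom multiply. The counts
# below are illustrative, not a real modulation scheme: the number
# of distinguishable symbols is the product, and bits per symbol
# is its base-2 logarithm.
wavelengths = 8     # separate wavelength channels
phase_levels = 4    # distinguishable phase states
polarizations = 2   # two polarization states

symbols = wavelengths * phase_levels * polarizations
bits_per_symbol = math.log2(symbols)
print(symbols, bits_per_symbol)  # -> 64 6.0
```

Sixty-four distinguishable states, six bits per symbol, in one channel: the small orchestra, counted.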
But photons cannot yet compute. They move beautifully but do not add, multiply, compare, or sort with the reliability we expect from a processor. Which is why engineers are building hybrid photonic-electronic architectures, where electrons manage logical operations while photons handle movement—like a relay race in which one sprinter crosses continents while the other is exceptionally good at climbing the stairs to the library.
This is not hypothetical. Prototypes exist. Startups are filing patents that read like fanfiction co-authored by Maxwell and Turing. The bleeding edge is already lightly scratched.

A Light Touch on Radix-n
Binary logic—“on/off,” “zero/one”—served humanity admirably from the telegraph to the smartphone. But as chip architectures grow more complex and the cost of moving electrons rises, engineers are revisiting an old question:
Why only 0 and 1? Why not 0, 1, 2? Or 0–3? Or 0–7?
Higher-radix systems allow more information per operation, fewer steps per calculation, and more compact structures for certain algorithms. Ternary logic (base-3) had its moment in the 1960s. Quaternary logic (base-4) has advocates today. Ultimately, the future may belong to radix-n systems—architectures using whatever number base is best suited to the task at hand.
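The appeal of higher radices is easy to put in numbers: the same value needs fewer digits in a bigger base. A small sketch (the helper function is ours, purely illustrative; the hard part, as the next paragraph notes, is the hardware, which this arithmetic cheerfully ignores):

```python
import math

# How many digits a value needs in different radices. Fewer digits
# per value is the appeal of higher-radix logic; the cost (much
# harder circuit design) does not show up in arithmetic this simple.
def digits_needed(value: int, radix: int) -> int:
    """Number of base-`radix` digits needed to write `value`."""
    return max(1, math.ceil(math.log(value + 1, radix)))

N = 1_000_000
for radix in (2, 3, 4, 8):
    print(radix, digits_needed(N, radix))
```

One million takes twenty binary digits but only thirteen ternary ones: more information per digit, fewer steps per calculation, exactly as advertised.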
Of course, this all sounds perfectly reasonable until someone tries to design the hardware. The combinatorial complexity escapes human intuition almost immediately. Binary logic is difficult enough; radix-n logic requires layout optimizations that no engineer can track without violating workplace safety guidelines.
This is where machine-generated design enters the story—not AI in the mystical sense, but AI in the “we simply have too many options” sense.

Systems Designing Systems
Modern chip fabrication has crossed the threshold where human-guided design is insufficient. Tools from companies like ASML, Synopsys, and Cadence now help generate circuit patterns for extreme-ultraviolet (EUV) lithography that no human could have conceived unaided. Today’s lithographic masks—so intricate they resemble mandalas drafted by caffeinated ants—are themselves produced by algorithms exploring billions of configurations.
There is nothing supernatural about this. It is simply the inevitable mathematical truth that an engineer with a pencil cannot outmatch a system that evaluates thirty million variations before lunch.
The design loop works like this:
- Engineers set constraints, goals, and forbidden pathways.
- Algorithms explore the vast middle space.
- The system spits out something far better than anything we would have drawn.
- Humans nod solemnly and realize that they are now piloting an experimental aircraft.
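The loop above can be caricatured in a few lines. This is a toy, entirely illustrative: the design space, the constraint, and the figure of merit are all invented, but the shape of the process—humans set the rules, the machine samples more candidates than any pencil could—is the real one:

```python
import random

# A toy sketch of the design loop: humans set constraints and a
# figure of merit; the machine samples a design space too large to
# enumerate by hand and keeps the best candidate it finds.
random.seed(42)

def merit(design):
    """Invented figure of merit: reward wide traces and more layers,
    penalize footprint. Not a real EDA objective."""
    width, spacing, layers = design
    return layers * (width - 0.5 * (width + spacing))

best = None
for _ in range(100_000):               # real tools evaluate many millions
    design = (random.uniform(1, 10),   # trace width (arbitrary units)
              random.uniform(1, 10),   # trace spacing
              random.randint(1, 8))    # metal layers
    if design[1] < 2:                  # a human-imposed "forbidden pathway":
        continue                       # spacing below 2 is disallowed
    if best is None or merit(design) > merit(best):
        best = design

print(best, round(merit(best), 2))
```

The solemn nodding happens at the `print` statement, when the winning design is something no one in the room would have drawn.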
This shift is philosophically strange in the same quiet way that hybrid chips are strange: not frightening, not dramatic—just new.

The Geography of Power
All this computing generates heat, and the machines that keep the universe flowing through our servers consume hundreds of megawatts. You cannot run these systems off a municipal grid designed to keep supermarkets refrigerated and porch lights glowing.
Power transmitted over long distances wastes energy as heat. Electricity leaks into the air, humming along wires like an exhausted commuter. The solution is obvious: build the data centers next to the power sources. Not near them. At them.
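The physics behind “at them, not near them” is resistive loss, which grows as the square of the current. A minimal sketch with invented numbers (the line resistance and power level are assumptions, not a real transmission study):

```python
# Why distance and voltage matter: resistive line loss is I^2 * R,
# and for a fixed delivered power, current falls as voltage rises.
# All figures below are illustrative assumptions.
POWER_W = 100e6    # 100 MW delivered to the load
LINE_OHMS = 10     # assumed total resistance of a long line

for volts in (115e3, 345e3, 765e3):
    current = POWER_W / volts          # amperes
    loss_w = current ** 2 * LINE_OHMS  # watts dissipated as heat
    print(f"{volts / 1e3:.0f} kV line: {loss_w / 1e6:.2f} MW lost")
```

Tripling the voltage cuts the loss by nearly an order of magnitude, and eliminating the line eliminates it entirely, which is the data center’s whole argument for moving in with the wind farm.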
And so, the future of astrophysical computing—and AI computing, and every other computing that ends in “-scale” (hyperscale, exascale, megascale)—is being built in places that once specialized in cattle, cotton, and very long horizons.

West Texas: The New Nerve Center
Near Abilene and Amarillo, where wind turbines stretch across the plains like rows of white tuning forks, Alphabet/Google has built new facilities designed to inhale vast amounts of renewable power directly from the generation source. No transmission losses. No detours. The servers eat wind for breakfast.
Beside them rise the most quietly revolutionary power systems of the decade: gravity batteries.
Not lithium. Not molten salt. Not vats of chemical slurries. Just cranes, pulleys, motors, and gigantic concrete blocks the size of small garages, an engineering feat that would have impressed Archimedes and baffled Edison.
When excess power is available, the blocks are lifted. When power is needed, they are lowered, converting potential energy into electricity like an industrial-scale grandfather clock. The engineering is deeply satisfying—brutally simple, almost rude in its refusal to require rare earth elements.
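The grandfather-clock arithmetic is worth doing once. A minimal sketch, assuming a 35-tonne block and a 100-meter lift (both figures are illustrative, not the specifications of any deployed system):

```python
# Potential-energy arithmetic for one gravity-battery block:
# E = m * g * h. The mass and lift height are illustrative
# assumptions, not the specs of a real installation.
MASS_KG = 35_000   # a ~35-tonne concrete block
LIFT_M = 100       # height the block is raised, in meters
G = 9.81           # gravitational acceleration, m/s^2

energy_joules = MASS_KG * G * LIFT_M
energy_kwh = energy_joules / 3.6e6   # 1 kWh = 3.6 MJ

print(f"{energy_kwh:.1f} kWh per full lift-and-drop cycle")
# -> roughly 9.5 kWh, before conversion losses
```

A handful of kilowatt-hours per block, which is why these systems come as towers of many blocks rather than one heroic monolith: the physics is simple, and it scales by stacking.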
And it works.
These systems flatten out demand spikes that would otherwise cook entire substations. They balance load without combustion, spills, emissions, or quantum-entangled marketing promises. They lift heavy things and set them down again, which West Texas has been doing for a very long time.
Thus, in what future historians will describe with delight, the post-industrial computing revolution now depends on technology found in 18th-century Black Forest cuckoo clocks.

The Unintentional Environmental Elegance
The universe is frugal; it seems fitting that its interpreters must be, too. One understated triumph of this architecture is that it is sustainable without trying to be. The motivation is not virtue but physics:
- No long-distance transmission means no line loss and no contentious right-of-way battles.
- No chemical batteries mean no rare-earth supply chain nightmares.
- Less waste heat from electrons means fewer cooling towers.
- No combustion means no carbon emissions.
- Gravity batteries function as natural backups for grid demand spikes.
The telescopes gather the light. These new machines make sense of it rapidly enough for us to learn something new. The elegance is almost accidental. The engineers wanted efficiency. The planet benefits incidentally.

IV. Coda: Data Center First Light
Astronomers speak of first light with a particular reverence. It is not the day a telescope is assembled, funded, or blessed by committees. It is the moment the shutter opens and the instrument—finally—sees. Rubin had its first light in June 2025. Roman will open its eyes in 2027. These are the moments that earn photographs, ribbon-cuttings, and a thin dusting of ceremony.
But there is a quieter, third first light, almost never named and rarely noticed.
It happens inside the exascale facilities—the unglamorous concrete halls humming beside wind farms and high-voltage substations on the plains. No crowds. No countdown. No commemorative plaques. Just the instant when the machines accept their first real torrents of sky-data, run a full chain of queries from end to end, and return their first pattern. A kind of illumination without photons: not seeing light, but making sense of it.
Nothing flashes. No arc lamp sputters. The “light” here is a completed thought.
This third first light completes the arc.
A neighborhood bookstore might hold fifteen thousand volumes, the entire reach of one life’s imagination arranged on shelves. A modern observatory producing sixty petabytes a decade stretches that scale to the mythic—millions of skies’ worth of pages arriving faster than anyone can read them. And then comes the final necessity: an architecture not only capable of receiving the flood but of interpreting it, shaping torrents into meaning before they rush past.
Each first light demands its own structure of access. The bookstore needed aisles, a ladder, and a patient proprietor. The observatories needed mirrors, sensors, and mountaintop air. These data centers need power drawn at the source, networks braided with light, and machines quick enough to notice what no human eye can catch in time.
Their illumination is not visual; it is functional. This is first light as capability—the moment the entire apparatus, from a telescope’s quartz-coated mirror to a photonic interconnect in West Texas, becomes a single, continuous instrument for discovery. And what does this infrastructure make possible?
It allows us to witness the sky in motion rather than as a frozen postcard. It lets faint, transient events be caught before they vanish. It makes the universe not merely observable, but legible.
A century ago, first light was a village affair: families climbing a hill to watch a telescope turn skyward. Today, one form still happens on mountaintops, but another unfolds in a windowless room in West Texas, where a rack of servers recognizes a pattern no one has ever seen and quietly logs it.
No applause. No shutter-click. Just the subtle moment when understanding begins.
The lights never blink. But the world, in that instant, becomes a little brighter.
