Hey everyone!
As we settle into 2025, I've been reflecting on how wild this tech journey has been for me. It's crazy to think about how much we've built since the early days. Back then, who would've thought we'd get this far this fast?
I grew up as a '90s kid. I was there to witness the web being born and the dotcom boom that followed. I was there when we leaped from late-'80s BBS terminals to CompuServe, from AOL dial-up to cable, and from cable to fiber.
I was there, hunting for slow-scan TV images through my grandfather's ham radio antenna, tuning by ear, back before fax machines were briefly "a thing." Since those humble beginnings, walking the aisles of RadioShack for parts, tech has advanced tremendously. Now we're generating 4K video from photorealistic world models trained on the collective digital footprint of all humanity, at a price point approaching free.
How Did We Get Here?
Flashing back to when I was really young, maybe six or seven: we had this old DOS-based i386 PC at home. My parents bought it for bookkeeping in their gift shop, but for some brave reason, they let me use it whenever no one else was on it. That was my time to tinker.
On that black-and-white console, I wrote my first-ever program. It filled the screen with obnoxious scrolling text and chaotic beeping, with different output depending on which family member typed in their name. Between the variables and the conditionals, that machine taught me the basics of algebra before my first grade-school lesson on the subject. Sometimes, if I managed to sneak the script open while someone was working on something important, it would throw the machine into an endless loop that required a manual reboot. Mischievous? Absolutely. But to me, it was the greatest thing ever. It was the beginning.
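That original script is long gone, so here's a rough re-creation in modern Python rather than DOS-era BASIC. The names and messages are invented from memory, but the spirit is accurate:

```python
# A rough re-creation of my first program; the original DOS script is
# long gone, so the names and messages here are invented from memory.
import sys
import time

GREETINGS = {
    "mom": "MOM RUNS THE BEST GIFT SHOP IN TOWN!!! ",
    "dad": "DAD IS A BOOKKEEPING LEGEND!!! ",
}

def main() -> None:
    name = input("Who goes there? ").strip().lower()
    message = GREETINGS.get(name, "INTRUDER ALERT!!! ")
    # Scroll obnoxious text forever, beeping via the terminal bell
    # ('\a'), until the victim gives up and reboots... or hits Ctrl-C.
    try:
        while True:
            sys.stdout.write(message + "\a")
            sys.stdout.flush()
            time.sleep(0.05)
    except KeyboardInterrupt:
        print("\nOkay, okay. No reboot required this time.")

if __name__ == "__main__":
    main()
```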
I was hooked.
Getting Hyped
HyperCard landed on our family Macintosh, which my dad mainly used for music production back in the days of SampleCell ROMs. This was just as System 7 was coming out, long before Mac OS 9 or the ground-up rewrite that became OS X. It gave me my first taste of bringing ideas to life through software. It felt amazing to have a voice through programming, as crude and spaghetti-coded as my stacks were. I found myself.
I was there, exploring the early web, a wild frontier. I surfed in on Netscape Navigator and talked to random early adopters on Internet Phone, long before Skype.
I dove into hidden corners of Hotline and IRC to discover tools like MacsBug, ResEdit, and Resorcerer. MacsBug was Apple's low-level debugger for troubleshooting and analyzing software crashes; ResEdit and Resorcerer were resource editors that let me peek under the hood of Mac OS applications and manipulate system resources and code in ways most users never would.
I'd modify shareware resource forks, altering the embedded data and settings of software before the dawn of mods and skins. I'd patch authentication checks so they JUMPed to branches of code they were never supposed to reach (for educational purposes only, of course).
I took a system extension called Aaron (which skinned Mac OS 7.5 to look like the then-upcoming Mac OS 8) and hacked it to look like BeOS. My Mac looked like no one else's at the time.
These experiments weren't just about personalizing my computer; they ignited a passion for programming and software customization. My tinkering with HyperCard and these tools led to an internship at the local college, where I published nationally distributed educational software in ActionScript. Those deep dives into security eventually led to contract work with Apple doing NIST-grade, zero-day SecOps.
As I grew older, the internet evolved too. From dial-up connections to gigabit lines, from static HTML pages to peer-to-peer blockchains, it's been an incredible journey. I remember the excitement of getting our first email accounts, exploring GeoCities and Angelfire websites, and chatting on Palace before "blogging," let alone "microblogging," was even a word.
I remember HotBot before Google came out, MySpace before Facebook. Heck, LiveJournal before MySpace. I was there.
Back then, the sense of community online was different. There was a certain innocence and openness that seems rare now in the age of social media giants and constant connectivity. Even then, there were hints of the challenges we face today: privacy concerns, information overload, and the struggle to distinguish truth from fiction.
The Pursuit of AGI
I was always dreaming about the big stuff: Artificial General Intelligence and the singularity. I'd get lost pondering digital consciousness and digitized free will, not just philosophically, but very practically. Like, "Okay, how do I even begin to describe the first steps toward that in pseudocode? What does the most basic, fundamental version look like? What's the right model for even beginning to think about thinking about this?"
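For flavor, here's the kind of naive sketch I'd scribble: a bare-bones perceive-deliberate-act loop. Every name and structure here is mine, invented for illustration; it's a toy, not a blueprint.

```python
# The most naive "thinking about thinking" loop I could sketch.
# Every name here is invented for illustration; the hard part is,
# of course, everything hiding inside these placeholder methods.
from dataclasses import dataclass, field

@dataclass
class Mind:
    beliefs: dict = field(default_factory=dict)  # crude world model
    goals: list = field(default_factory=list)    # what it wants to know

    def perceive(self, observation: dict) -> None:
        # Fold new evidence into the world model.
        self.beliefs.update(observation)

    def deliberate(self) -> str:
        # Pick an action that (it believes) serves an unmet goal.
        for goal in self.goals:
            if goal not in self.beliefs:
                return f"investigate:{goal}"
        return "idle"

    def act(self, action: str) -> dict:
        # Acting changes the world; the result feeds back as observation.
        if action.startswith("investigate:"):
            topic = action.split(":", 1)[1]
            return {topic: "some hard-won insight"}
        return {}

mind = Mind(goals=["understand_self"])
observation: dict = {}
for step in range(3):
    mind.perceive(observation)
    action = mind.deliberate()
    print(f"step {step}: {action}")  # investigates once, then idles
    observation = mind.act(action)
```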
Back then, AGI felt like pure science fiction. Saying you wanted to work on it was a quick way to get laughed out of the room, like announcing you were going to build a perpetual motion machine. So I took the "practical" career path, or as practical as it could be growing up in a family of artists.
I witnessed firsthand how rapidly the music industry transformed. The shift from physical media to digital downloads, and then to streaming, reshaped how artists create and share their work; with each transition, they lost a little more connection to their audience, and a little more of the revenue. It taught me about adaptability and the relentless pace of innovation.
My passion for visual aesthetics led me to graphic design, but I soon realized that creating interactive experiences was even more fulfilling. This drove me to explore product design, where I could combine my design skills with an understanding of user needs.
Curiosity and a desire to build more complex systems pushed me further into the realm of engineering. Before I knew it, I was an engineering team lead reviewing pull requests for a decentralized finance startup. Crypto was exploding, and I found myself consulting on everything from search & analytics to DeFi and smart contract primitives. It was exciting — chaotic, but in the best way.
But despite all the freelance gigs and career changes, I never stopped dreaming about AGI and what it could mean for the future. It's one of those ideas that just sticks with you.
So... About Now
And now, in 2025, it's like the impossible is becoming possible. Desktop supercomputing is becoming a reality. It's our four-minute mile: now that it's been run, everyone knows it can be done. It's not just the big tech companies diving into AGI anymore; the potential is there for everyday people too.
This realization has made me recalibrate my own timelines. I used to think that OpenAI or some other major player would crack AGI in the next few years. Now, I'm realizing it's going to run on my own local network.
Our past gives us a trajectory, and it's incredible to see where it's leading. The idea that AGI could soon be accessible in our own homes? Mind-blowing. It's here, and soon, I might be able to get you a copy.
What's Got Me Excited Right Now?
Out of all the news about 2025 releases, I’m most excited about Project DIGITS from NVIDIA.
Just recently at CES 2025, NVIDIA unveiled something incredible: a stackable personal AI supercomputer built on the GB10 Grace Blackwell Superchip. It's designed for developers and data scientists, but honestly, it's going to shake things up for all of us working with AI. Starting at $3,000, it's wild how accessible serious AI development is becoming.
What does that mean? It means we'll be able to prototype, fine-tune, and run massive AI models locally. Each unit ships with 128 GB of unified memory; link two together for 256 GB, and you're looking at running models of up to 405 billion parameters. That's supercomputing right there on our desks, folks.
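The memory math roughly checks out, too. Here's a quick back-of-the-envelope sketch; the assumptions (quantized weights only, no KV cache or activation overhead counted) are mine, not NVIDIA's:

```python
# Back-of-the-envelope memory math for a dual Project DIGITS setup.
# My assumptions, not NVIDIA's: weights only, no KV cache/activations.
def weight_footprint_gb(params_billions: float, bits_per_weight: int) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB, to match the marketing specs

for bits in (16, 8, 4):
    print(f"405B params @ {bits:2d}-bit: {weight_footprint_gb(405, bits):5.1f} GB")

# 405B @ 16-bit: 810.0 GB -> no chance
# 405B @  8-bit: 405.0 GB -> still too big for 256 GB
# 405B @  4-bit: 202.5 GB -> fits in 256 GB, with headroom for context
```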
Everything we've been dreaming about will soon be within reach.
Here's to forging into the future while tracing back our roots. I can't wait to see where this journey takes us next...
Cheers!
🍻Ryan