Hi everyone!
Is it just me, or are the weeks flying by on coronavirus time? Working from home often feels like living at work, but rest assured: the challenge today builds strength for tomorrow.
Surround yourself with people who are working on cooler projects than you.
Wolfram Research holds an annual summer program at Bentley University in Massachusetts, offering students a uniquely immersive experience with full-day lectures and mentor-guided research projects at the intersection of theoretical physics, computation, and innovation.
I was accepted into the program after a live-coding interview to study alongside a small cohort of energetic science/tech wizards.
(Due to COVID-19, this year’s program was held fully online, with an intense schedule that ran from 6am to 9pm.)
At our first Zoom meeting introduction, when a fellow student was asked what he was working on, he casually replied, “Simulating matter at the atomic scale with a team of scientists using one of the world’s most powerful supercomputers.” 😲
These encounters and conversations were the most memorable gems of the program – the collective drive and ambition were contagious.
I had the opportunity to speak with Stephen Wolfram on several occasions while implementing graph representations of threaded conversations from the Apache Software Foundation’s software-development mailing lists.
(Watch this talk: Graph Databases Will Change Your Freakin Life.)
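The actual project was built in Wolfram Language, but the underlying idea is simple enough to sketch. Here’s a rough Python analogue (the sample messages and IDs are made up for illustration): each email becomes a node, and each In-Reply-To header becomes an edge from the reply to its parent, so conversation threads fall out as connected components.

```python
# Illustrative sketch only -- not the original Wolfram Language implementation.
import networkx as nx

# Hypothetical messages: (message_id, in_reply_to, subject)
messages = [
    ("<m1@apache.org>", None,              "[PROPOSAL] New build pipeline"),
    ("<m2@apache.org>", "<m1@apache.org>", "Re: [PROPOSAL] New build pipeline"),
    ("<m3@apache.org>", "<m2@apache.org>", "Re: [PROPOSAL] New build pipeline"),
    ("<m4@apache.org>", "<m1@apache.org>", "Re: [PROPOSAL] New build pipeline"),
]

G = nx.DiGraph()
for msg_id, parent_id, subject in messages:
    G.add_node(msg_id, subject=subject)
    if parent_id is not None:
        G.add_edge(msg_id, parent_id)  # reply -> message being replied to

# Each weakly connected component is one conversation thread.
threads = list(nx.weakly_connected_components(G))
print(f"{G.number_of_nodes()} messages in {len(threads)} thread(s)")
```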
Meet InboxGraph
View InboxGraph:
https://community.wolfram.com/groups/-/m/t/2026483
View WSS 2020 projects:
https://wolfram-school.tumblr.com
In my “20% fun time,” I crafted this Wolfram Physics Project Sierpinski gasket rule:
and this auditory fractal using an inverse spectrogram:
Listen here.
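If you’re curious how an “auditory fractal” like this can be made, here’s a hedged Python sketch of the same general trick (not my original notebook): paint a Sierpinski pattern into a magnitude spectrogram, then invert it to a waveform with the Griffin–Lim algorithm.

```python
# Sketch: render a Sierpinski gasket as a spectrogram, then invert to audio.
import numpy as np
import librosa
import soundfile as sf

sr = 22050                      # sample rate for the rendered audio
n_freq, n_frames = 513, 512     # 513 bins -> griffinlim infers n_fft = 1024

# Sierpinski gasket via Pascal's triangle mod 2: cell (i, j) is "on" when i & j == 0
i = np.arange(n_freq)[:, None]
j = np.arange(n_frames)[None, :]
pattern = ((i & j) == 0).astype(np.float32)

# Treat the pattern as a magnitude spectrogram and invert it to a waveform
S = 40.0 * pattern              # scale magnitudes so the result is audible
y = librosa.griffinlim(S, n_iter=32)

sf.write("sierpinski_fractal.wav", y / np.max(np.abs(y)), sr)
```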
Phyte For Your Life

I gave a small nutrition-themed talk this month, scratching the surface of a larger biochemistry/food-science space that I hope to explore in more detail.
While many popular diets focus on macronutrient sources and ratios, I’m proposing an approach focused on phytonutrient/myconutrient diversity for epigenetic health.
View slides
The Art of Code
Dylan Beattie gave a wonderful presentation on the art of code at the NDC conference in London.
Software and technology have changed every aspect of the world we live in. At one extreme are the ‘mission-critical’ applications - the code that runs our banks, our hospitals, our airports, and our phone networks. Then there’s the code we all use every day to browse the web, watch movies, create spreadsheets… not quite so critical, but still code that solves problems and delivers services.
But what about the code that only exists because somebody wanted to write it? Code created just to make people smile, laugh, maybe even dance? Maybe even code that does nothing at all, created just to see if it was possible?
I’m reminded of this Easter Egg in the HTTP response code specification, and this “LOLWUT” function in Redis.
LOLWUT wants to be a reminder that there is more in programming than just putting some code together in order to create something useful.
My favorite portion of the talk was on quines, a term coined by Douglas Hofstadter in the book Gödel, Escher, Bach, in honor of the philosopher Willard Van Orman Quine.
https://en.wikipedia.org/wiki/Quine_(computing)
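For a taste of the idea, here’s a small quine in Python (the talk and the Quine Relay use other languages); running it prints its own source code, character for character:

```python
# A three-line Python quine: running it prints its own source exactly.
s = '# A three-line Python quine: running it prints its own source exactly.\ns = %r\nprint(s %% s)'
print(s % s)
```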
View on YouTube:
https://youtu.be/6avJHaC3C2U
View Quine Relay on GitHub:
https://github.com/mame/quine-relay
Lifelogging with a VR Headset
Like an episode of Black Mirror: using an archive of first-person lifelogging footage and a VR headset, Lucas Rizzotto was able to experience what it would be like to have perfect digital recall of any past event.
He reports:
“The moment you see through your eyes again, your brain lights up and you remember everything connected to that moment. You don’t just see the memory portal, you see everything around it.”
“Those moments didn’t feel like memories. They felt like gifts made by a past version of myself delivered across time, and upon receiving them, all I wanted to do was to return the favor – to shut down the time machine and go out into the world to create a future that’s impossible to forget.”
View on YouTube:
https://youtu.be/aHyNYfFfXlg
GPT-3: An Even Bigger Language Model
Last year, OpenAI achieved state-of-the-art performance on many natural language processing tasks with a very large transformer-based language model, and they released the complete version of the pretrained model just last November. In the short time since, they’ve developed a new model that isn’t twice as large as the previous one, or even 10x as large, but roughly 100x larger.
The progress in AI research continues to blaze beyond Moore’s law.
In sample outputs, this neural network generates text that is nearly indistinguishable from human-written text, while exhibiting an emergent “few-shot learning” property.
Given a large-scale corpus of content from the web, this unsupervised model seems to be learning about more than just our collective web musings. It seems to be at the beginning stages of learning how we learn.
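As a rough illustration of what “few-shot” means here: instead of fine-tuning the model, you show it a handful of input/output examples directly in the prompt and let it infer the pattern. A hypothetical prompt might look like this (just the prompt text, no API calls, and not taken from the GPT-3 paper):

```python
# Illustration of few-shot prompting: the "training examples" live in the
# prompt itself, and the model is expected to continue the pattern.
# (Hypothetical example text.)
few_shot_prompt = """Translate English to French.

English: Where is the library?
French: Où est la bibliothèque ?

English: I would like a coffee, please.
French: Je voudrais un café, s'il vous plaît.

English: The weather is nice today.
French:"""

# A sufficiently large language model, given this prompt, tends to complete the
# final line with a plausible translation -- no gradient updates, no
# task-specific fine-tuning, just pattern completion from context.
print(few_shot_prompt)
```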
View on YouTube:
https://youtu.be/_8yVOC4ciXc
Thinking back to GEB’s quines, I wonder what we would see from a large language model trained on a gigantic corpus of open-source code rather than text.
Add a generative adversarial network with a compression-ratio fitness function modeling auditory/visual interactions, let it guide the model in rewriting its own codebase, and we would seem to be on a steep trajectory toward self-generalizing agents. 🤔
That’s it for now!
“A problem is a chance for you to do your best.” - Duke Ellington