No. 220: Convergence in Advancing AI Technology

Over 50 years ago, Gordon Moore, one of the founders of Intel, observed that the number of transistors chip makers could fit on a computer chip doubled every two years. Moore's Law held true until the past couple of years, when the power these chips need to compute became unsustainable and their computing power began to level off. So software researchers and hardware researchers have been thinking of new, more energy-efficient ways to increase computing power. As someone who loves a good connection, I was thrilled to come across an opportunity for these two efforts to meet.
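To get a feel for what "doubling every two years" actually means, here's a quick sketch of the math. The baseline (the Intel 4004's roughly 2,300 transistors in 1971) and the clean two-year doubling are simplifying assumptions for illustration, not exact industry data.

```python
# Illustrative Moore's Law projection: transistor counts doubling every
# two years from an assumed 1971 baseline of ~2,300 transistors.

def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Forty years of doubling takes you from a few thousand transistors to billions, which is roughly where real chips landed, and it's why even a slight leveling-off is such a big deal.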

Deloitte’s Applied AI Leader, Melissa Smith, posted the tweet below, which got me thinking about this.

When I was a kid, my mom would let me look at bacteria using the electron microscope in her laboratory. Shoutout to North Carolina A&T State University for investing in its laboratory capabilities. It was incredible to see extremely small organisms with such clarity. But I could only observe these organisms in isolation. I couldn’t see how they interacted with other organisms, let alone predict how they would behave depending on changes in the environment. More than 20 years later, things have changed. Electron microscopes now allow researchers to examine nanoparticles with incredible accuracy.

There’s a whole field of science called quantum physics that describes the fundamental elements of the universe – photons, electrons, etc. The paper Melissa linked to is part of an effort to equip researchers with the tools to simulate and study the interaction of lots of particles and make predictions about phenomena like gravity.

The researchers figured out a way to use neural networks, basically algorithms designed to mimic how our brains identify patterns, to simulate the interactions of many particles at once, known as quantum systems. They turned to this approach because of the limits we are hitting in computing power.
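The general idea of a neural-network wavefunction can be sketched in a few lines. This is a minimal toy, not the specific method from the paper: a tiny restricted-Boltzmann-machine ansatz for a 4-spin transverse-field Ising chain, with all sizes, couplings, and parameters chosen purely for illustration. The network assigns an amplitude to each spin configuration, and the variational principle guarantees its energy sits at or above the true ground-state energy.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, M = 4, 8          # visible spins, hidden units (illustrative sizes)
J, h = 1.0, 0.5      # Ising coupling and transverse field (assumed values)

a = rng.normal(scale=0.1, size=N)       # visible biases
b = rng.normal(scale=0.1, size=M)       # hidden biases
W = rng.normal(scale=0.1, size=(M, N))  # visible-hidden couplings

def psi(s):
    """RBM wavefunction amplitude for a spin configuration s in {-1,+1}^N."""
    s = np.asarray(s, dtype=float)
    return np.exp(a @ s) * np.prod(2 * np.cosh(b + W @ s))

# Enumerate all 2^N configurations (feasible only for tiny systems;
# real calculations sample configurations with Monte Carlo instead).
configs = list(itertools.product([-1, 1], repeat=N))
amps = np.array([psi(s) for s in configs])

# Build H = -J * sum_i s_i s_{i+1} - h * sum_i sigma^x_i in the same basis.
dim = len(configs)
index = {s: k for k, s in enumerate(configs)}
H = np.zeros((dim, dim))
for k, s in enumerate(configs):
    H[k, k] = -J * sum(s[i] * s[i + 1] for i in range(N - 1))
    for i in range(N):                  # sigma^x flips one spin
        flipped = list(s)
        flipped[i] = -flipped[i]
        H[k, index[tuple(flipped)]] -= h

# Variational energy of the ansatz vs. the true ground-state energy.
E_var = amps @ H @ amps / (amps @ amps)
E_exact = np.linalg.eigvalsh(H)[0]
print(f"variational energy: {E_var:.4f}, exact ground state: {E_exact:.4f}")
```

In a real neural-network quantum state calculation, the network's parameters are then optimized to push the variational energy down toward the exact value, and configurations are sampled rather than exhaustively enumerated, which is what lets the approach scale to systems far too large for brute-force simulation.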

Kunle Olukotun is a professor at Stanford who changed the game for computing in the '90s and early '00s. Back then, processing speeds were leveling off, similar to what is happening today. He introduced the multi-core processor, which kickstarted another wave of speed increases. Today, Kunle and his team at SambaNova Systems are building a new software language and hardware flexible enough for artificial intelligence applications to run at scale. The talk below gives a nice overview of what the SambaNova team is building.

SambaNova hasn’t released its products yet, so for the time being the neural network solution the researchers came up with fills the gap. It will be really cool to see the impact that software and hardware able to handle quantum systems simulations have on their research. That convergence could be really powerful. I, for one, would love to see how much has changed since the days when I would ask my mom if I could look at something under the electron microscope.

Published by

Kwame Som-Pimpong

My name is Kwame Som-Pimpong. This is my blog. You can email me at kwame.som.pimpong@gmail.com.
