Real-time Experiments with an AI Co-Scientist - Stefania Druga, fmr. Google Deepmind

Channel: aiDotEngineer

Published at: 2025-07-28

YouTube video id: wNH3q9pqn0U

Source: https://www.youtube.com/watch?v=wNH3q9pqn0U

My name is Stefania. I'm so glad you made it to the last day of the conference and came to the robotics track. We're going to start with a live demo, and then we'll switch to the presentation, just to swap things around. I'm going to try to connect the microscope over here, and then the other camera and some sensors.

My talk is about a real-time co-scientist. Think about pair programmers: how many of you use some form of copilot for coding? Right. It's just like that, but for doing things in the real world, like science experiments. You can see here I have a micro:bit board with Jacdac modules connected, and it's measuring the temperature. I actually have a heat pad, so I can increase the temperature and make it very, very hot, and hopefully not melt the board. Then I can send that data to my science assistant, and it will analyze it and give me an answer in real time. It's telling me that the ambient conditions indicate a stable, dark, quiet environment, because it knows which sensors it can read (no buttons pressed), and it sees that the temperature is 26 degrees.

I can also give it more context. I can go and create a protocol and say: this is the type of experiment I want to do, so whenever it gives me feedback about the data or the microscope images I'm sending (which I can also do), it does so in the context of that experiment, with those conditions and those constraints. And if you're actually running the same experiment several times, you can go ahead and create a custom page for it. It will monitor the data (we're going to connect the Jacdac again) and plot it in real time.
The cool thing is that if you want to leave your experiment running and go away, you can have cameras that are autonomous, to tie back to the previous talk. They're not fully autonomous, but they can track objects. This one is called the reCamera. By the way, all the hardware and components I'm showing you are open source. Right now I'm going to set it to track me as a person and start the tracking. It's running a model on the camera itself, so it will start moving around to follow me and see where I'm going. But you could train a custom model to track crystal growth, or whatever specific objects you want to monitor, in real time. It runs on Wi-Fi, so you can place it anywhere, and then control the conditions in the experiment, like increasing or decreasing the temperature. Oh, it sees all of you. That's awesome.
I'm going to stop this for now. That was a short demo, and here we're seeing our temperature increasing; I think my heat pad stopped. But that was just a simple demo, without a specific experiment in mind. Now I'm going to switch to my slides and show you what happens when you record much longer experiments, and what the system can do.

Why do we care about co-scientists? We're not talking yet about real-time co-scientists, just co-scientists in general: why should you care? We know there's a data overload in science, a lot of complexity, a lot of things to parse through and analyze, and AI can really help with that. We also know that beyond analyzing data quickly and at scale, it can help generate new hypotheses and identify blind spots in prior work and in your own thinking. And it can speed things up: instead of testing one hypothesis at a time, you could test a hundred.
So I got inspired to build this demo. By the way, the cost of all the parts is under $300, it took me two weeks to build, and it's open source, so you can play with it and hack on it. I got inspired by this demo from DeepMind: this paper on the AI co-scientist, published two months ago. It wasn't operating in the real world; it showed what happens, when you're analyzing papers and data, if you have a multitude of agents that can perform the different roles we play as scientists. What are those roles? We analyze papers, we summarize them, we look at data, we rank the options, we rank the different hypotheses. So they created an orchestration of different agents, each running on Gemini 2.0, that ranks the different results and hypotheses, searches online and through papers, and comes up with a plan for the researcher to use.
And not only did they build that, they also tested it against prior discoveries, which was super interesting. They tested it against prior discoveries on gene transfer mechanisms: a discovery that took scientists 12 years took the AI co-scientist two days to arrive at, without having seen the data. It was not aware of how that gene transfer works; this was not part of the training. So that was one form of verification, replicating past results. Another was coming up with completely new hypotheses. They used it for liver fibrosis target treatment discovery, and the AI co-scientist proposed target drugs that were actually effective in the wet lab; they were tested by experts. So it's not science fiction. It's not something in the future; it's actually happening now. And what was inspiring about this for me is that when we can make real discoveries in drug discovery, healthcare treatment, bacterial resistance, and new types of materials, I think that can be accelerated by doing it in real time with the scientist.
This clicker doesn't work very well. So the vision was: instead of working asynchronously on existing data and giving the researchers a plan based on it, what if we do this in real time, and formulate hypotheses based on the empirical data we're observing in the lab as it comes in? Or, say, while observing a robot breaking down in real time. So what motivated me were the results from the AI co-scientist, but also this vision from Silver and Sutton. They just published the paper "Welcome to the Era of Experience," which is fantastic. It's very short; if you haven't read it, I highly recommend it. They talk about how we're going past the era of human data, where we only index and make predictions based on the datasets we created, into an era where the AI learns from the continuous environment in which we operate. And especially with multimodal models, when we have real-time data from images, from sensors, from audio streams, that is possible.
So I hope I've convinced you by now that real time matters. I wanted to show you how I use my system with longer experiments. You already saw the chat interface, and I had to find experiments that I could actually do at home.

This is the overview of the system. It's a very simple React app with all the input sources you've seen live in action: the Jacdac sensors via USB, different webcams (you can add as many as you want), and text input. It also works with voice, which I forgot to show; you can talk to it and it talks back. All of these inputs become web hooks (I'll go into more detail on how I optimized each one), and that gets sent to a backend, which communicates with the real-time API; I use Gemini in this case.

This is the information flow. You have the physical sensors; for the Jacdac I'm using the WebUSB API. That goes to the frontend hooks, which get polled by the context assembly. The context assembly is always checking which modalities are present: do we have text? Do we have voice? Do we have an image? Do we have a chat history? Depending on which modalities are present, every time I send a message it builds a context that gets sent to the API, which gives me back a response. That's how the unified context assembly works. The important piece here is how you do the context injection, depending on which sensors are connected and the type of experiment you're defining, and making this dynamic, on the fly, as you create protocols.
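A rough sketch of that context-assembly step might look like this. The function name and payload shape are hypothetical (the real system is a React frontend with a separate backend), but the idea is the same: bundle whichever modalities happen to be present into every message.

```python
# Minimal sketch of dynamic context assembly: every message bundles
# whichever modalities are present right now. All names and payload
# shapes here are hypothetical, not the app's actual API.

def assemble_context(text=None, sensor_readings=None,
                     image_b64=None, chat_history=None, protocol=None):
    """Build a multimodal message payload from whatever inputs exist."""
    parts = []
    if protocol:                  # experiment definition, if one was created
        parts.append({"type": "text", "text": f"Experiment protocol: {protocol}"})
    if sensor_readings:           # e.g. latest Jacdac values polled over WebUSB
        line = ", ".join(f"{k}={v}" for k, v in sensor_readings.items())
        parts.append({"type": "text", "text": f"Live sensor data: {line}"})
    if image_b64:                 # webcam or microscope frame
        parts.append({"type": "image", "data": image_b64})
    if chat_history:              # prior turns, already in part form
        parts.extend(chat_history)
    if text:                      # the user's current message
        parts.append({"type": "text", "text": text})
    return parts

payload = assemble_context(
    text="How is the experiment going?",
    sensor_readings={"temperature_c": 26.0, "humidity_pct": 41.0},
    protocol="salt crystallization, gradual cooling",
)
```

The payload would then be sent to the backend, which forwards it to the model; the point is that the context is rebuilt dynamically on every message from whatever is connected.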
Those were the ingredients for the code; now let's talk about the ingredients for the hardware. When I started doing the experiments, I had to lay out all the parts I had at my disposal. Kind of like cooking, right? What can I make with all these sensors? These are my inputs, these are my outputs, these are all my cameras and cables, and this is how many boards I have. What sort of experiments could I create with that?

I also had the constraint of doing experiments that I could measure in real time for this talk, that are safe for me to do at home, and that I can travel with. That really narrowed down the list of things I could play with. One thing I explored was crystallization; you can see there the crystals I was growing. That took longer, and I recorded the whole thing. The other was fermentation. Let's dive into each of those. This was my lab-in-a-box in our apartment in Tokyo. The people coming to the house were very confused every day.
There were no explosions, though, so that was good. So, for crystal growth: how many of you remember this from chemistry? Okay, that's awesome, so I don't need to go into great detail. The principle is quite simple: you supersaturate the solution by adding salt to hot water, and then slowly cool it off, which drives nucleation and the growth of crystals. Now, the trick to growing beautiful crystals is how you do the cooling: it needs to be gradual, and you need to avoid moving the liquid or the container. Gradual cooling and the right humidity level are what give you beautiful crystal formation. So the main things we want to measure in this experiment are: what the curves are for how fast the salt dissolves, where the crystals are forming (those spots are called nucleation sites), and what the growth rate of the crystals is. All of that is affected by temperature and concentration.
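Since gradual cooling is what matters most here, one simple check on a temperature log is the cooling rate per interval. This is a sketch; the 0.5 °C/min threshold is an arbitrary illustration, not a value from the experiment.

```python
# Sketch: compute cooling rate (deg C per minute) from a temperature log
# and flag intervals that cool faster than a threshold. The 0.5 limit
# is an illustrative placeholder, not a value from the experiment.

def cooling_rates(samples, max_rate=0.5):
    """samples: list of (minutes, temp_c) pairs, in time order.
    Returns (rates, too_fast_flags), one entry per interval."""
    rates, flags = [], []
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        rate = (c0 - c1) / (t1 - t0)   # positive = cooling
        rates.append(rate)
        flags.append(rate > max_rate)
    return rates, flags

log = [(0, 60.0), (10, 57.0), (20, 48.0), (30, 46.0)]
rates, flags = cooling_rates(log)
# rates: [0.3, 0.9, 0.2] -> the middle interval cooled too fast
```

A check like this answers the "was I cooling too fast?" question directly from the logged data instead of by eyeballing a plot.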
Oh, and I forgot to play this. This was a sped-up recording from the microscope. It's moving because there's a fan blowing cool air from ice over the sample; that's how I was cooling it off. You can see the sensor data changing on the side. This was recorded over an extended period of time, and then I get a CSV of the sensor values, which I can analyze to get a sense of what happened. Was I cooling it off too fast or too slow? Were my crystals growing or not? I also played with different samples: some went in the fridge and some stayed in the room, at different temperatures, so I could also measure control groups. And once I have that CSV data, I can plot it, so I can actually see what my crystal growth rate was and whether my temperature profile was good, and I can plot that for the different samples. Right now that isn't integrated live on the platform, but that's coming; I had to write a separate script to parse the CSV data and create these visualizations.
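That separate script isn't published yet, but its core could look something like this sketch; the column names and values are invented for illustration, and plotting would just be a matplotlib call on the same arrays.

```python
# Sketch: parse a sensor CSV log and compute per-interval growth rate.
# Column names and values are hypothetical, not the actual log format.
import csv
import io

CSV_DATA = """minutes,temperature_c,crystal_area_mm2
0,30.0,0.0
60,27.5,0.4
120,25.0,2.1
180,24.0,2.3
"""

rows = list(csv.DictReader(io.StringIO(CSV_DATA)))
times = [float(r["minutes"]) for r in rows]
areas = [float(r["crystal_area_mm2"]) for r in rows]

# growth rate between consecutive samples (mm^2 per minute)
growth_rate = [(a1 - a0) / (t1 - t0)
               for (t0, a0), (t1, a1) in zip(zip(times, areas),
                                             zip(times[1:], areas[1:]))]
# to visualize: matplotlib's plt.plot(times[1:], growth_rate)
```

The same per-sample arrays work for comparing the fridge vs. room-temperature control samples side by side.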
The insight I got from this is that crystal formation is actually not gradual: it happens in bursts. That surprised me when I looked at the data. Once the critical saturation is reached, there's a sudden surge of crystal growth. It's not something that slowly accumulates; it grows in bursts. And here's a 40-minute recording of the crystal growth, super sped up, so you can see how the humidity and the temperature were changing.
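One way to surface that burst behavior from logged data is to flag intervals whose growth is far above the typical interval. This is a sketch; the 3x-median threshold is an arbitrary choice of mine, not from the talk.

```python
# Sketch: flag growth "bursts" as intervals whose growth is far above
# the typical (median) interval. The 3x factor is an arbitrary choice.
from statistics import median

def find_bursts(areas, factor=3.0):
    """areas: crystal area per frame, in time order.
    Returns indices of intervals with burst-like growth."""
    deltas = [b - a for a, b in zip(areas, areas[1:])]
    typical = median(deltas)
    return [i for i, d in enumerate(deltas) if d > factor * typical]

# mostly slow growth, then a sudden jump once saturation is reached
areas = [0.0, 0.1, 0.2, 0.3, 1.5, 1.6, 1.7]
print(find_bursts(areas))  # -> [3]
```

Running this over the full time series would locate exactly when each burst happened, which could then be lined up against the temperature and humidity logs.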
And
the the next one was am I doing on time
because I have a lot of things. Okay.
Uh, I might skip over fermentation just
in interest of time. But um I'm sure all
of you have dealt with fermentation in
some shape or form even if it was liquid
having to benefit the the out uh the end
result. Um, but the the insight here is
to actually control how much salt and
sugar we put into the different uh dough
and then measure and and then also
change the temperature and measure the
growth rate uh and the CO2 and uh that
was the fermentation and I also have
like a sped up recording of the data
collection
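As a sketch of that comparison, you could compute a rise rate per dough condition; all the condition names and numbers below are invented for illustration.

```python
# Sketch: compare dough rise rate across conditions. The condition
# names and measurements are invented, not data from the talk.

# (condition, start_height_mm, end_height_mm, minutes)
trials = [
    ("low sugar, room temp",  20, 28, 120),
    ("high sugar, room temp", 20, 41, 120),
    ("high sugar, fridge",    20, 24, 120),
]

# rise rate in mm per minute, keyed by condition
rise_rate = {name: (end - start) / minutes
             for name, start, end, minutes in trials}
best = max(rise_rate, key=rise_rate.get)
```

The same structure extends naturally to CO2 readings: log one series per condition, then compare slopes.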
I already showed you the reCamera, and the fact that it's mobile and can track objects. This is the education version, if you want to play with some of these things and do your own experiments. The lab of mine that I showed you isn't deployed yet, but it will be, and the code will be open source. This version, though, you can play with now: you need to provide your own API key, but it runs on the phone, and you can test the camera for now. And if you have a micro:bit, or you want to connect one of mine, you're welcome to try it; let me know if the QR code doesn't work.
Yeah, this is a version meant for teaching kids. I wanted to end by talking a little bit about the open-source ecosystem that can support this type of initiative, to go beyond a demo and cool experimentation into a real solution for scientists and builders. There's an entire open-source ecosystem for recreating all the lab equipment, and also for creating machines that support the automation of pipetting and all the analysis and manipulation you need to do in a lab. The Jubilee motion platform actually comes from the University of Washington, where I did my PhD, and it's open source. There's also an open bioreactor, and a workshop just took place in April at UW where people spent a week hacking on different solutions for automating scientific experiments in the lab: anything from droplet manipulation to robots handling liquids, mixing vials, and other applications built with Jubilee. So definitely check them out.
And for the future (this is my last slide): you can imagine that, based on the samples we're collecting from these cameras, sensors, and voice, it's much easier to create simulations. We don't need to be limited by the experiments we can do in real life; rather, those real-life experiments can inform realistic simulations. I could create a simulation of my crystal growth, but also more useful things, like bacteria colony growth, run those simulations with the lab conditions, identify the best conditions for the experiment, and then create those conditions in real life.
I'm very excited about integrating simulation; I think that's where this is going. That was me. If you want to read more about my projects, the paper, and all the open-source projects (I do a lot of work in education as well), it's all on my website. And as Ben mentioned this morning, we're going to do an AI education summit, which I'm very excited about. Some of you who saw my talk in New York or last year know how passionate I am about education, so I hope you can join us for the summit. I really appreciate you coming to the talk today, and I'll be around. Thank you so much.