Team:EPF Lausanne/Hardware

From 2014.igem.org


Latest revision as of 03:07, 18 October 2014

Hardware



Having successfully engineered touch-responsive bacteria, we next needed to detect and process the signals they emit in order to build a functional BioPad. The microfluidic chip containing our engineered organisms has the advantage of being small and portable, and we aimed to keep these characteristics throughout the project. That is why, instead of using a big and cumbersome detection device, we opted for a small and cheap Raspberry Pi. Let the adventure of building the BioPad detector begin!

Raspberry Pi

[Image: Raspberry Pi board]

The Raspberry Pi is a small and cheap (about 40 CHF) single-board computer. It will be used to monitor the light emitted by each chamber of the microfluidic chip, letting us detect and process every emitted signal through this one small device!

When brainstorming how to build the detector, we initially drafted the following setup:

[Image: initial sketch of the detector setup]


As seen in the picture above, the camera is directly linked to the Raspberry Pi and has a clear view of the whole chip. Our final device therefore needed a small, high-resolution camera able to track signal emission from our touch-responsive organisms, including signals emitted in the infrared spectrum.

[Image: Raspberry Pi NoIR camera module]

The camera best suited to these requirements was the Raspberry Pi NoIR: a 5-megapixel, 1080p, 20mm x 25mm x 9mm CMOS camera. It is especially good at picking up low-intensity signals. Moreover, NoIR stands for No Infrared filter, meaning that this camera can also see near-infrared wavelengths.

The near-infrared spectrum corresponds to wavelengths between 700 and 1000 nm. We will use the Raspberry Pi NoIR's ability to detect near-infrared light to track the IFP signal emitted by our CpxR - split IFP1.4 stress-responsive cells. Emission of IFP at these wavelengths is especially useful for us, as few things auto-fluoresce in the infrared spectrum. This drastically increases the precision of our device by reducing background noise.

Taking all of the above into account, the plan for the BioPad detector is to collect the entire light spectrum, including near-infrared wavelengths, and then use a filter to eliminate the visible part of the spectrum.


[Image: detection plan for the Raspberry Pi setup]

Light tracking

To track the signal dynamics in the chambers with our detector, we wrote a custom C++ program using OpenCV that detects the exact position of each signal as well as its nature and intensity. The entire code and all supporting files can be found here: https://github.com/arthurgiroux/igemtracking/.
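
The extract of the main tracking code (src/tracking.cpp) embedded on the wiki no longer renders here. As a rough, hypothetical illustration of this kind of OpenCV intensity tracking, and not the actual repository code, a minimal sketch might look like this:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cap(0);                  // camera stream from the Pi
    if (!cap.isOpened()) return 1;

    cv::Mat frame, ycrcb, luma, mask;
    while (cap.read(frame)) {
        // Work on intensity rather than color: convert to YCrCb
        // and keep only the Y (luma) channel.
        cv::cvtColor(frame, ycrcb, cv::COLOR_BGR2YCrCb);
        cv::extractChannel(ycrcb, luma, 0);

        // Keep only bright pixels; the threshold of 200 is an
        // arbitrary placeholder that would be tuned on real footage.
        cv::threshold(luma, mask, 200, 255, cv::THRESH_BINARY);

        // Each bright blob is a candidate signal: report its centroid.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                         cv::CHAIN_APPROX_SIMPLE);
        for (const auto& c : contours) {
            cv::Moments m = cv::moments(c);
            if (m.m00 > 0)
                std::cout << "signal at (" << m.m10 / m.m00 << ", "
                          << m.m01 / m.m00 << ")\n";
        }
    }
    return 0;
}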




Check out the result of our program below:

[Image and video: output of the light-tracking program]

[Image: representation of the YCrCb color space]

To detect the signal, we need to extract the right information from the pixels. The most common color space used in programming is RGB, where a pixel's color is split into three components (Red, Green, and Blue), each taking a value between 0 and 255.

We used another color space better adapted to this application, as we were especially interested in light intensity. We therefore chose the YCrCb color space.


In the YCrCb color space, each pixel is decomposed into three components:

  • Y – the luma value (intensity)
  • Cr – the red difference
  • Cb – the blue difference

This allows us to extract the necessary information for our application.
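
For reference, a small helper showing how the Y plane can be pulled out with OpenCV; the conversion formulas in the comments are the standard 8-bit BGR-to-YCrCb mapping that cv::cvtColor applies:

#include <opencv2/opencv.hpp>
#include <vector>

// Extract the intensity (Y) plane from a BGR image.
// For 8-bit images, cv::COLOR_BGR2YCrCb computes approximately:
//   Y  = 0.299*R + 0.587*G + 0.114*B
//   Cr = (R - Y) * 0.713 + 128
//   Cb = (B - Y) * 0.564 + 128
cv::Mat lumaPlane(const cv::Mat& bgr) {
    cv::Mat ycrcb;
    cv::cvtColor(bgr, ycrcb, cv::COLOR_BGR2YCrCb);
    std::vector<cv::Mat> channels;
    cv::split(ycrcb, channels);   // channels[0]=Y, [1]=Cr, [2]=Cb
    return channels[0];
}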

Lenses

The Raspberry Pi camera we use has a fixed lens, which is not adapted to our needs, as it does not let us change the focus, the aperture, or the zoom.

We searched for a setup that would give us more control over the optics and found that the easiest way was to remove the original Raspberry Pi lens, fit a CS mount on the board, and attach a much bigger lens.

The first thing we did was to unplug the camera module from the PCB. Then the original lens was carefully unscrewed and removed.

[Image: removing the lens from the camera module]


Once the lens is removed, the camera sensor (CMOS) is exposed.

[Image: exposed CMOS sensor]


The CS mount was screwed on the board and the lens plugged in.

[Image: camera with the CS-mount lens attached]


Here is what the lens sees:

[Image: view of the microfluidic chip through the new lens]

We can clearly see the chambers of the microfluidic chip.

GFP tracking

Once we had our device, we wanted to test it with the microfluidic chip, staying as close as possible to the setup we had envisioned at the beginning.

We decided to load some sfGFP on the microfluidic chip and track the emission coming from the chambers.

We assembled the following installation:

[Image: detector installation over the microfluidic chip]


In order to excite the sfGFP, we built a little circuit composed of an LED of the right wavelength (470 nm) and a resistor to protect the LED from burning out. We used an Arduino board to have more control over it.
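
As an illustration of how simple this circuit is (the component values below are assumptions, not the ones we recorded): with a 5 V supply, a blue LED with a forward voltage of about 3.2 V and a target current of 20 mA needs a series resistor of R = (5.0 - 3.2) / 0.020 = 90 ohms, so the next standard value of 100 ohms is a safe choice. A hypothetical Arduino sketch to drive and dim the LED could be:

// Hypothetical sketch: drive the 470 nm excitation LED from pin 9
// through a ~100 ohm series resistor (values assumed, see above).
const int LED_PIN = 9;          // PWM-capable pin

void setup() {
    pinMode(LED_PIN, OUTPUT);
}

void loop() {
    // PWM dimming gives finer control over excitation intensity
    // than simply switching the LED on and off.
    analogWrite(LED_PIN, 128);  // ~50% duty cycle
}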

[Image: Arduino board with the LED circuit]


[Image: representation of the HSV color space]

With this setup, we recorded a video of the microfluidic chip illuminated by the LED. To analyze this video, we built another OpenCV C++ program that uses a different color space, HSV, which decomposes a pixel into three components: Hue (the color), Saturation (the intensity of the color), and Value (the brightness).
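
The embedded extract of that program (src/checkgfp.cpp in the repository) does not render here. As a rough, hypothetical sketch of HSV-based GFP detection, and not the actual repository code, it could look like this:

#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char** argv) {
    // Hypothetical input: the recorded video of the illuminated chip.
    cv::VideoCapture cap(argc > 1 ? argv[1] : "chip.mp4");
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        // Keep bright, saturated green pixels (GFP emission is green).
        // OpenCV hue runs from 0 to 179, so green sits roughly at
        // 40-80; the exact bounds would be tuned on real footage.
        cv::inRange(hsv, cv::Scalar(40, 100, 100),
                    cv::Scalar(80, 255, 255), mask);
        double fraction = cv::countNonZero(mask) / double(mask.total());
        std::cout << "green pixel fraction: " << fraction << "\n";
    }
    return 0;
}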



With this code, here is the result we get:

[Video: GFP tracking on the microfluidic chip]
