Team:EPF Lausanne/Hardware

From 2014.igem.org

With microorganisms now responding to touch, we have a signal that we want to use. To do that, we need to detect and analyze it. One simple and fast way to do this is with a computer.<br />
Our microfluidic chip is small and portable, and we wanted to keep this trait throughout the project. That is why, instead of using a big and cumbersome device, we opted for a Raspberry Pi.<br /><br /></p>
<h3 id="raspberry">Raspberry Pi</h3>
<a href="https://static.igem.org/mediawiki/2014/e/ec/121220065212-raspberry-pi-close-up-horizontal-gallery.jpg" data-lightbox="raspberry"><img src="https://static.igem.org/mediawiki/2014/e/ec/121220065212-raspberry-pi-close-up-horizontal-gallery.jpg" alt="Raspberry Pi" class="pull-left img-left img-border" style="width: 30%;" /></a>
<br /><br />
<p>The Raspberry Pi is a small and cheap (40.- CHF) single-board computer. We will use it to monitor the light emitted by each chamber of the microfluidic chip.<br /><br /></p>
<div class="cntr">
<a href="https://static.igem.org/mediawiki/2014/3/37/Raspberry_.png" data-lightbox="raspberry"><img src="https://static.igem.org/mediawiki/2014/3/37/Raspberry_.png" alt="Raspberry Pi" class="img-border img-responsive" /></a></div>
<br />
<br />
<p>In order to track the light, we need a high-resolution camera that can be easily integrated into the final system.<br /></p>
<a href="https://static.igem.org/mediawiki/2014/6/61/1567kit_LRG.jpg" data-lightbox="raspberry"><img src="https://static.igem.org/mediawiki/2014/6/61/1567kit_LRG.jpg" alt="Raspberry Pi" class="pull-right img-right img-border" style="width: 30%;" /></a>
<p>There was a choice of cameras with these characteristics, and we chose the Raspberry Pi NoIR. NoIR stands for No Infrared Filter, meaning that this camera can see near infrared wavelengths.<br /></p>
<p>Near infrared corresponds to wavelengths between 700 and 1000 nm. We will use this feature of the camera to track the IFP signal. This is especially useful because few things emit near infrared light, which reduces the background noise.<br /></p>
<p>The idea is to collect the spectrum up to near infrared wavelengths and then use a filter to eliminate the visible spectrum.</p>
<br />
</div>
<br />
<h3 id="lighttracking">Light tracking</h3>
<p>In order to view the light change in a chamber, we need to track the light intensity.
For this part, we wrote custom C++ code using OpenCV (the entire code is available at <a href="https://github.com/arthurgiroux/igemtracking/" target="_blank">https://github.com/arthurgiroux/igemtracking/</a>).</p>
<script src="http://gist-it.sudarmuthu.com/https://github.com/arthurgiroux/igemtracking/blob/master/src/tracking.cpp"></script>
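The core of the measurement, averaging the pixel intensity inside each chamber's region of interest, can be sketched in a few lines. This is a simplified, self-contained illustration without the OpenCV dependency, not the team's actual code; the function name and buffer layout are our own assumptions:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Simplified sketch: mean pixel intensity inside a rectangular region
// of interest (ROI) of a grayscale frame stored row-major in a flat
// buffer. On a single channel this is what cv::mean(frame(roi)) returns.
double meanIntensity(const std::vector<uint8_t>& frame, std::size_t width,
                     std::size_t x, std::size_t y,
                     std::size_t w, std::size_t h) {
    double sum = 0.0;
    for (std::size_t row = y; row < y + h; ++row)
        for (std::size_t col = x; col < x + w; ++col)
            sum += frame[row * width + col];
    return sum / static_cast<double>(w * h);
}
```

In the real setup the ROIs would correspond to the chamber positions on the chip, and the frames would come from the NoIR camera.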
<p>Here is the resulting output:</p>
<p><img src="https://static.igem.org/mediawiki/2014/c/cd/Lighttracking2.png" alt="" class="img-responsive" /></p>
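One way to turn such an intensity trace into an event signal is a simple baseline-plus-margin threshold. The sketch below is a hypothetical post-processing step, not part of the published tracking code; the baseline length and margin values are arbitrary choices:

```cpp
#include <cstddef>
#include <numeric>
#include <vector>

// Hypothetical post-processing sketch: flag the frames whose mean
// intensity rises above the baseline (the average of the first n
// frames) by more than a fixed margin.
std::vector<std::size_t> framesAboveBaseline(const std::vector<double>& luma,
                                             std::size_t n, double margin) {
    double baseline =
        std::accumulate(luma.begin(), luma.begin() + n, 0.0) / n;
    std::vector<std::size_t> hits;
    for (std::size_t i = n; i < luma.size(); ++i)
        if (luma[i] > baseline + margin) hits.push_back(i);
    return hits;
}
```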
<p>The most common color space used in programming is RGB, where the pixel color can be split into three components (Red, Green, Blue), each taking a value between 0 and 255.</p>
<p>However, there are many other color spaces, each adapted to a different application. In our case, we are interested in light intensity and not in colors. This made us choose another color space: YCrCb.</p>
<p>In the YCrCb color space, each pixel is decomposed into three components:</p>
<ul>
<li>Y: the luma value (intensity)</li>
<li>Cr: the red difference</li>
<li>Cb: the blue difference</li>
</ul>
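For 8-bit images, OpenCV's RGB to YCrCb conversion follows the BT.601 formulas, so the Y channel we track is a weighted sum of the color channels. Below is a minimal standalone version of that conversion; the struct and function names are our own, not OpenCV's:

```cpp
#include <algorithm>
#include <cstdint>

// BT.601 RGB -> YCrCb for 8-bit pixels, matching the formulas OpenCV
// documents for cv::COLOR_RGB2YCrCb (delta = 128 for 8-bit images).
struct PixelYCrCb {
    uint8_t y;   // luma (intensity), the channel we track
    uint8_t cr;  // red difference
    uint8_t cb;  // blue difference
};

static uint8_t clamp8(double v) {
    return static_cast<uint8_t>(std::min(255.0, std::max(0.0, v + 0.5)));
}

PixelYCrCb rgbToYCrCb(uint8_t r, uint8_t g, uint8_t b) {
    double y = 0.299 * r + 0.587 * g + 0.114 * b;  // perceived intensity
    return { clamp8(y),
             clamp8((r - y) * 0.713 + 128.0),      // Cr
             clamp8((b - y) * 0.564 + 128.0) };    // Cb
}
```

Note that a pure gray pixel maps to Cr = Cb = 128, so only the Y component varies with brightness.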
 
<div class="pull-left img-left" style="width: 30%">
<h3 id="lenses">Lenses</h3>
  <p>The Raspberry Pi camera that we use has a fixed lens which is not adapted to what we want to do, as we cannot change the focus, the aperture or the zoom.</p>
<p>We searched for a different lens which would allow us more control, and found that the easiest way was to remove the initial Raspberry Pi lens, put a CS mount on it and attach a much bigger lens.</p>
<p>The first thing we did was to unplug the camera module from the PCB. Then, the lens was carefully removed by unscrewing it and the new lens was mounted. </p>
INSERT IMAGE
<p>You will then see the camera sensor (CMOS).</p>
<p>The CS mount was screwed on the board and the lens plugged in.</p>
<p>You can see here what the lens sees: <br /></p>
IMAGE!!!
<p>We can clearly see the chambers of the microfluidic chip.</p>

Revision as of 16:04, 17 October 2014
