Team:EPF Lausanne/Hardware
Hardware
With microorganisms now responding to touch, we have a signal that we want to use. To do that, we need to detect and analyze it. One simple and fast way to do this is with a computer.
Our microfluidic chip is small and portable, and we wanted to keep this trait throughout the project. That is why, instead of using a big and cumbersome device, we opted for a Raspberry Pi.
Raspberry Pi
The Raspberry Pi is a small and cheap (40.- CHF) single-board computer. We use it to monitor the light emitted by each chamber of the microfluidic chip.
This is what the initial device looked like:
In order to track the light, we need a high-resolution camera that can easily be integrated into the final system.
Several cameras had these characteristics, and we chose the Raspberry Pi NoIR. NoIR stands for "No Infrared filter", meaning that this camera can also see near-infrared wavelengths.
Near infrared corresponds to wavelengths between 700 and 1000 nm. We use this feature of the camera to track the IFP signal. This is especially useful for us: since few things emit an infrared signal, the background noise is low.
The idea is to collect the spectrum up to near infrared wavelengths and then use a filter to eliminate the visible spectrum.
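To give an idea of how frames can be grabbed on the Pi itself, here is a minimal capture sketch with OpenCV. It assumes the camera module is exposed as a standard V4L2 video device (for example via the bcm2835-v4l2 driver); the output file name is a placeholder.

```cpp
// Minimal capture sketch (assumption: the Pi camera is exposed as the first
// V4L2 video device, e.g. by loading the bcm2835-v4l2 driver).
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cap(0);                  // open the first video device
    if (!cap.isOpened()) {
        std::cerr << "Could not open the camera" << std::endl;
        return 1;
    }

    cv::Mat frame;
    cap >> frame;                             // grab a single frame
    if (frame.empty()) {
        std::cerr << "No frame captured" << std::endl;
        return 1;
    }

    cv::imwrite("chip_snapshot.png", frame);  // save it for inspection
    return 0;
}
```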
Light tracking
In order to view the light change in a chamber, we need to track the light intensity. For this part, we wrote a custom C++ program using OpenCV (the entire code is available at https://github.com/arthurgiroux/igemtracking/).
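The full implementation lives in the repository linked above. As a rough sketch of the idea (not the actual project code; the input file and chamber coordinates are placeholders), the tracking loop measures the mean pixel intensity inside a region of interest for every frame and logs it over time. The real code tracks the Y channel of the YCrCb color space, described below; a plain grayscale conversion is used here to keep the sketch short.

```cpp
// Rough sketch of a light-tracking loop: for every frame of a recording,
// measure the mean pixel intensity inside a fixed region of interest
// (one chip chamber) and log it over time.
#include <opencv2/opencv.hpp>
#include <fstream>
#include <iostream>

int main() {
    cv::VideoCapture cap("recording.avi");        // placeholder input video
    if (!cap.isOpened()) {
        std::cerr << "Could not open the video" << std::endl;
        return 1;
    }

    cv::Rect chamber(120, 80, 60, 60);            // placeholder chamber position
    std::ofstream log("intensity.csv");           // one intensity value per frame

    cv::Mat frame, gray;
    int frameIndex = 0;
    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        double intensity = cv::mean(gray(chamber))[0];
        log << frameIndex++ << "," << intensity << "\n";
    }
    return 0;
}
```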
The resulting output looks like this:
The most common color space used in programming is RGB, where the pixel color is split into three components (Red, Green and Blue), each taking a value between 0 and 255.
However, there are many other color spaces, each adapted to a different application. In our case, we are interested in light intensity rather than in color, which made us choose another color space: YCrCb (see the short sketch after the list below).
In the YCrCb color space, each pixel is decomposed into three components:
- Y – the luma value (intensity)
- Cr – the red difference
- Cb – the blue difference
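As a minimal sketch (again, not the project code itself; the input image name is a placeholder), converting a frame to YCrCb with OpenCV and isolating the Y plane looks like this:

```cpp
// Sketch: decompose a BGR frame into its Y, Cr and Cb planes and keep only
// the luma (intensity) plane.
#include <opencv2/opencv.hpp>
#include <vector>
#include <iostream>

int main() {
    cv::Mat frame = cv::imread("frame.png");       // placeholder input image
    if (frame.empty()) {
        std::cerr << "Could not read the image" << std::endl;
        return 1;
    }

    cv::Mat ycrcb;
    cv::cvtColor(frame, ycrcb, cv::COLOR_BGR2YCrCb);

    std::vector<cv::Mat> planes;
    cv::split(ycrcb, planes);                      // planes[0] = Y, [1] = Cr, [2] = Cb

    cv::Mat luma = planes[0];                      // intensity channel used for tracking
    std::cout << "Mean intensity: " << cv::mean(luma)[0] << std::endl;
    return 0;
}
```

Tracking the Y plane rather than a single RGB channel means the measurement follows overall brightness instead of hue, which is exactly what we need for a light-intensity readout.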
Lenses
The Raspberry Pi camera that we use has a fixed lens, which is not suited to our needs: we cannot change the focus, the aperture or the zoom.
We searched for a lens that would give us more control, and found that the easiest solution was to remove the original Raspberry Pi lens, fit a CS mount on the board and attach a much bigger lens.
The first thing we did was to unplug the camera module from the PCB. Then we carefully unscrewed the original lens, exposing the camera sensor (CMOS). Finally, the CS mount was screwed onto the board and the new lens was plugged in.
Here is what the lens sees:
We can clearly see the chambers of the microfluidic chip.