Team:EPF Lausanne/Hardware
Hardware
Having successfully engineered touch-responsive bacteria, the next major step towards a functional BioPad is to detect and process the emitted signals. The microfluidic chip containing our engineered organisms has the advantage of being small and portable, and we aimed to preserve these characteristics throughout the project. That is why, instead of using a big and cumbersome detection device, we opted for a small and cheap Raspberry Pi. Let the adventure of building the BioPad Detector begin!
Raspberry Pi
The Raspberry Pi is a small and cheap (40.- CHF) single-board computer. It will be used to monitor the light emitted by each chamber of the microfluidic chip, so that all emitted signals can be detected and processed by this small device!
When brainstorming how to build the detector, we initially drafted the following setup:
As seen in the picture above, the camera is directly linked to the Raspberry Pi and has a clear view of the whole chip. Our final device thus needed a small, high-resolution camera able to track signal emission from our touch-responsive organisms, including signals emitted in the infrared spectrum.
The camera best suited to these requirements was the Raspberry Pi NoIR, a 5-megapixel, 1080p, 20 mm x 25 mm x 9 mm CMOS camera that performs especially well with low-intensity signals. Moreover, NoIR stands for "No Infrared filter", meaning that this camera can also see near-infrared wavelengths.
The near-infrared spectrum corresponds to wavelengths between 700 and 1000 nm. We will use the ability of the Raspberry Pi NoIR to detect near-infrared light to track the IFP signal emitted by our CpxR - split IFP1.4 stress-responsive cells. Emission of IFP at these wavelengths is especially useful for us, as very few things autofluoresce in the infrared spectrum. This drastically increases the precision of our device by reducing background noise.
Taking all of this into account, our detection strategy for the BioPad detector is to capture the entire light spectrum, including near-infrared wavelengths, and then use a filter to remove the visible part of the spectrum.
Light tracking
To track the signal dynamics in the chambers with our detector, we wrote a custom C++ program using OpenCV that detects the exact position of the signal as well as its nature and intensity. The entire code and all supporting files can be found here: https://github.com/arthurgiroux/igemtracking/ (the main tracking routine is src/tracking.cpp). An illustrative sketch of such a detection step is given after the color-space discussion below.
Check out the result of our program here:
To detect the signal, we need to extract the relevant information from the pixels. The most common color space used in programming is RGB: each pixel's color is split into three components (Red, Green, and Blue), each taking a value between 0 and 255.
We were especially interested in light intensity and color nuances, so we used a color space better adapted to this purpose: YCrCb.
In the YCrCb color space, each pixel is decomposed into three components:
- Y – the luma component (overall intensity)
- Cr – the red-difference chroma component
- Cb – the blue-difference chroma component
This allows us to extract the necessary information for our application.
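For illustration, here is a minimal sketch of such a YCrCb-based detection step using the standard OpenCV C++ API. The video source name and intensity threshold are hypothetical placeholders, and this is only a simplified sketch; our full implementation is src/tracking.cpp in the repository linked above.

// Illustrative sketch (not the full tracking.cpp): locate the brightest
// region of each frame using the luma (Y) channel of the YCrCb color space.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main(int argc, char** argv) {
    // Open a video file (or any source OpenCV can read) given on the command line.
    cv::VideoCapture capture(argc > 1 ? argv[1] : "chip.avi");
    if (!capture.isOpened()) {
        std::cerr << "Could not open video source" << std::endl;
        return 1;
    }

    cv::Mat frame, ycrcb, luma;
    std::vector<cv::Mat> channels;
    while (capture.read(frame)) {
        // Convert from the default BGR representation to YCrCb and keep the Y channel.
        cv::cvtColor(frame, ycrcb, cv::COLOR_BGR2YCrCb);
        cv::split(ycrcb, channels);

        // Smooth the luma channel and find its brightest pixel.
        cv::GaussianBlur(channels[0], luma, cv::Size(9, 9), 0);
        double minVal, maxVal;
        cv::Point minLoc, maxLoc;
        cv::minMaxLoc(luma, &minVal, &maxVal, &minLoc, &maxLoc);

        // Report the signal only if it stands clearly above the background.
        if (maxVal > 200) {  // hypothetical threshold
            std::cout << "Signal at " << maxLoc << " with intensity " << maxVal << std::endl;
            cv::circle(frame, maxLoc, 15, cv::Scalar(0, 0, 255), 2);
        }

        cv::imshow("tracking", frame);
        if (cv::waitKey(30) == 27) break;  // stop on ESC
    }
    return 0;
}

Smoothing the luma channel before taking its maximum makes the detection less sensitive to isolated hot pixels, so the reported position corresponds to an actual bright region rather than sensor noise.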
Lenses
The Raspberry Pi camera that we use has a fixed lens, which is not suited to our needs since we cannot change the focus, the aperture or the zoom.
We looked for a lens that would give us more control, and found that the easiest solution was to remove the original Raspberry Pi lens, fit a CS mount onto the board and attach a much bigger lens.
We first unplugged the camera module from the PCB, then carefully unscrewed and removed the original lens so that the new one could be mounted.
The camera's CMOS sensor is then exposed.
The CS mount was screwed on the board and the lens plugged in.
You can see here what the lens sees:
We can clearly see the chambers of the microfluidic chip.
GFP tracking
Once we had our device, we wanted to test it with the microfluidic chip while staying as close as possible to the device we had envisioned at the beginning.
We decided to load some sfGFP on the microfluidic chip and track the emission coming from the chambers.
We put together the following installation:
In order to excite the sfGFP, we built a little circuit composed of an LED of the right wavelength (470 nm) and a resistor to protect the LED from burning out. We used an Arduino board to have more control over it.
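As an illustration of this control, a minimal Arduino sketch along these lines could switch the excitation LED on and off through the serial port; the pin number and the one-character command protocol below are hypothetical, not necessarily what we wired:

// Illustrative Arduino sketch: turn the 470 nm excitation LED on ('1') or
// off ('0') via serial commands, so the LED is only lit while recording.
const int LED_PIN = 9;  // hypothetical digital pin wired to the LED + resistor

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);  // accept commands over USB
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == '1') {
      digitalWrite(LED_PIN, HIGH);  // excitation on
    } else if (command == '0') {
      digitalWrite(LED_PIN, LOW);   // excitation off
    }
  }
}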
With this set-up we recorded a video of the microfluidic chip illuminated by the LED. To analyze this video, we built another OpenCV C++ program (checkgfp.cpp in the same repository) that uses a different color space, HSV, which decomposes each pixel into three components: hue (the color), saturation (the intensity of the color) and value (the brightness).
Running this program on the recorded video gives the following result:
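To give an idea of what such an HSV-based analysis can look like, here is a minimal, self-contained sketch that thresholds each frame on hue, saturation and value to isolate green fluorescence. The hue range and the other threshold values are hypothetical; the actual program we used is checkgfp.cpp in the repository mentioned above.

// Illustrative sketch (not checkgfp.cpp): isolate green fluorescence by
// thresholding in the HSV color space and report how much of the frame glows.
#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char** argv) {
    cv::VideoCapture capture(argc > 1 ? argv[1] : "gfp.avi");
    if (!capture.isOpened()) {
        std::cerr << "Could not open video source" << std::endl;
        return 1;
    }

    cv::Mat frame, hsv, mask;
    while (capture.read(frame)) {
        // Convert BGR -> HSV so hue, saturation and value can be thresholded independently.
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        // Keep only bright, saturated, green-ish pixels. The hue range
        // (about 40-90 on OpenCV's 0-179 scale) is a hypothetical choice for GFP emission.
        cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(90, 255, 255), mask);

        // The fraction of pixels kept by the mask is a simple proxy for the amount of fluorescence.
        double fraction = static_cast<double>(cv::countNonZero(mask)) / mask.total();
        std::cout << "GFP-positive fraction: " << fraction << std::endl;

        cv::imshow("gfp mask", mask);
        if (cv::waitKey(30) == 27) break;  // stop on ESC
    }
    return 0;
}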