tag:blogger.com,1999:blog-30464292090725959722024-02-19T15:11:07.999-08:00 Kartik MadhiraRobotics.Embedded.Image Processing.Miscellaneous Projects.Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.comBlogger24125tag:blogger.com,1999:blog-3046429209072595972.post-9266203945699662652018-01-03T04:21:00.002-08:002018-06-17T04:15:18.827-07:00Publications:<div dir="ltr" style="text-align: left;" trbidi="on">
<a href="https://scholar.google.co.in/citations?user=zva59GQAAAAJ&hl=en" target="_blank">Google Scholar profile</a></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-50328077525164787122017-08-14T11:23:00.004-07:002017-08-14T11:23:58.591-07:00ROS Autonomous Bot - Clerkbot - WiFi Teleop<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
It has been almost two to three months since I last posted about Clerkbot, largely because I moved to Bangalore and my new job as a Data Scientist is really hectic. I still had a few videos left to post, hence this update. I had also been facing a lot of issues setting up a fully functional, WiFi-operable Clerkbot.</div>
<div style="text-align: justify;">
One of the reasons was the power setup for the whole system. I was running a WiFi router, two motors drawing almost 1 A each, and the NVIDIA Jetson TK1 board, which was also running the RPLIDAR A2, all on a 12 V 2200 mAh LiPo battery. One big mistake I made was not balancing the LiPo battery: unbalanced cells tend to develop a potential difference among themselves, and the output cannot supply sufficient current for the whole setup. I think this is the best explanation for the weak power output I was getting, though I could be wrong. I tried powering the RPLIDAR through an externally powered USB 2.0 extension adapter (via a power bank) on the Jetson, to no avail; it caused a weird delay in the messages sent from the base Ubuntu ROS station to the bot. I then tried the Jetson's USB 3.0 port: the delay went away, but the port supplies only about 500 mA, which is not enough for the USB 3.0 adapter, so the bot stopped responding after about a minute. Here's a look at the setup:</div>
<div style="text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/P_mOhZgHFOg/0.jpg" src="https://www.youtube.com/embed/P_mOhZgHFOg?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-1628651959853582652017-07-12T00:18:00.002-07:002017-07-12T00:18:20.248-07:00ROS Autonomous Bot (Clerkbot) - Distributed Network - WiFi Teleop Keyboard <div dir="ltr" style="text-align: left;" trbidi="on">
During the month of May I had been using a wired setup for the Clerkbot, and it was becoming increasingly difficult to map the classrooms at Nirma. The ultimate goal of the setup is a fully autonomous robot. When setting up the robot, the assumption was that the NVIDIA board is powerful enough for the ROS setup. It is not, and it would have been better to get our hands on the Jetson TX1, which is more powerful and is also used by the MIT autonomous racing team.<br />
<br />
The Jetson TK1 is only just powerful enough to run a ROS-based setup, and only by degrading the quality of the autonomous stack: I had to set the planner frequency and the costmap resolution to the lowest values possible. If anyone is trying to build a ROS-based robot, I would not recommend the Jetson TK1. Or it may simply be that, being an amateur C++ programmer, I wrote computationally expensive nodes for the odometry publisher and the tf publisher, which makes the system very unstable. Nonetheless, I did manage to get a distributed setup running on the Clerkbot. In this distributed system, the onboard computer runs the navigation front end: the GMapping node for map creation, the odometry publisher, and the RPLIDAR node. The base computer contains the brains of the setup: the navigation planner and localization data are forwarded to the Jetson TK1 over WiFi, and the overall setup thus gets 'distributed' between the Jetson TK1 and the base computer. I put in a WiFi router to provide the wireless data link between the two.<br />
Here's a YouTube video of the setup:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/P_mOhZgHFOg/0.jpg" src="https://www.youtube.com/embed/P_mOhZgHFOg?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-43295233924206985442017-04-16T07:50:00.001-07:002017-04-16T07:50:28.383-07:00Probabilistic Robotics - Odometry - Velocity based Model<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Autonomous navigation relies heavily on probabilities. The hard part isn't building ROS-based robots, but actually understanding the underlying parameters and algorithms. One book that is practically a bible for this approach is the one by S. Thrun, W. Burgard, and D. Fox - <a href="https://docs.ufpr.br/~danielsantos/ProbabilisticRobotics.pdf" target="_blank">Probabilistic Robotics.</a></div>
<div style="text-align: justify;">
I first thought of directly publishing Clerkbot's odometry using existing ROS nodes, but not knowing the underlying models and states isn't good for anyone interested in robotics. So here's my take on the velocity-based model from the book above, which is also the model on which the <a href="http://wiki.ros.org/base_local_planner" target="_blank">base_local_planner</a> works. This is because navigation planning inherently plans obstacle avoidance in terms of velocity commands; odometer readings are only helpful after the control command has been issued.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
There are two models in probabilistic kinematics:</div>
<br />
<ol style="text-align: left;">
<li style="text-align: justify;"><b>Odometric Model</b></li>
<li style="text-align: justify;"><b>Velocity Model</b></li>
</ol>
<div style="text-align: justify;">
I would like to emphasise one thing: in probabilistic robotics, as the name suggests, nothing is certain. You are constantly estimating states from given inputs, and because noise is present in almost all the states, the uncertainty is expressed with probability distributions. The odometric model can be realised with just one probability distribution,</div>
<div style="text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh8KHF8joxwqPFBEteJuOU35WmoNzDstTEqqk8xlZWIEw-rYU6XcTs0kxSU9lg3Q8NsvyYcTmitHVyXLfJwT73ZETrov2rJqVZe9f-TTjKe-ga4ZFmTqZ_Oc55tvXRtyYlXD22mbUKdObAY/s1600/Screen+Shot+2017-04-16+at+6.33.53+PM.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="83" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh8KHF8joxwqPFBEteJuOU35WmoNzDstTEqqk8xlZWIEw-rYU6XcTs0kxSU9lg3Q8NsvyYcTmitHVyXLfJwT73ZETrov2rJqVZe9f-TTjKe-ga4ZFmTqZ_Oc55tvXRtyYlXD22mbUKdObAY/s200/Screen+Shot+2017-04-16+at+6.33.53+PM.png" width="200" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
This is when you want to estimate the <i>pose </i>of the robot at the current time t, given the control command issued at time t and the previous pose at time t-1.</div>
<div class="separator" style="clear: both; text-align: justify;">
You can cross-verify here what I said in the previous paragraph: given a pose and a control command, you cannot be certain of the robot's position at time t+1 because of noise in the motors, odometers, or whatever actuator you choose. You cannot be certain of your position, but given the noise information, you can devise a normal distribution over it.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<b>Note: </b>The odometry model is more accurate, since you get revolution counts directly from the encoders. But for motion planning and obstacle avoidance, the velocity model wins the race.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
Velocity model can be broken down to the following points:</div>
<div class="" style="clear: both; text-align: left;">
</div>
<ol style="text-align: left;">
<li style="text-align: justify;">The translational and rotational velocities at an instant are <i>v </i>and <i>w </i>(read as vee and omega). The radius r of the arc traced is <i>v/w.</i></li>
<li style="text-align: justify;">Given the initial pose <i>(x, y, theta), </i>the final pose can be estimated using the equations, <b>assuming an error-free system </b>(Figures 2 and 3)<b>.</b></li>
<li><span style="font-weight: normal; text-align: justify;">The final equations, given the initial pose and assuming </span><b style="font-weight: bold; text-align: justify;">constant angular and translational velocity </b><span style="text-align: justify;">over a time step Δt (Figure 4).</span></li>
</ol>
<b><br /></b><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJXs84YWlrzwdsFXgzZelZhhyXgULkHBv4fCiI31NhrI952iD2PfVwrFufFlD16swdFS2TNq9rXIvslhn3BSlmIIN7s13-Hg4rea0FXt1bG8VwRhuiIhRHd3S13EO1CoBLXyHlJBjCOpu2/s1600/Screen+Shot+2017-04-16+at+7.49.45+PM.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="173" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJXs84YWlrzwdsFXgzZelZhhyXgULkHBv4fCiI31NhrI952iD2PfVwrFufFlD16swdFS2TNq9rXIvslhn3BSlmIIN7s13-Hg4rea0FXt1bG8VwRhuiIhRHd3S13EO1CoBLXyHlJBjCOpu2/s320/Screen+Shot+2017-04-16+at+7.49.45+PM.png" width="320" /></a></div>
<div class="" style="clear: both; text-align: center;">
fig 2 : Rotational and translational motion</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh34dOyWGiY5TxuP9nl84S8x6kZrfJaAwXkfHw31mRak_r_3JpRve-pANt1Q3dIH7lfbEWAg80gCeqvz-bc94yac84pPry-zv5pu2pwT-aUc38d5rc5_9DmFwtiK7LxjfpDka_hSn_XLgeT/s1600/Screen+Shot+2017-04-16+at+7.51.11+PM.png" imageanchor="1"><img border="0" height="102" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh34dOyWGiY5TxuP9nl84S8x6kZrfJaAwXkfHw31mRak_r_3JpRve-pANt1Q3dIH7lfbEWAg80gCeqvz-bc94yac84pPry-zv5pu2pwT-aUc38d5rc5_9DmFwtiK7LxjfpDka_hSn_XLgeT/s200/Screen+Shot+2017-04-16+at+7.51.11+PM.png" width="200" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
fig 3: Center of circle equations</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEibZGjVFhvnmidz1ysuRBCfyCsX4iHZ0UFc2SXGWcikWofhlpV-yw5If6mn2Q-W2gIrGsIcIbeJ_Tj18n-llBfUTznrkZm6KKYsI-tTehkoO09SSV9KppazqOodwHdD1w3THzfCT1tlxRvA/s1600/Screen+Shot+2017-04-16+at+8.02.28+PM.png" imageanchor="1"><img border="0" height="112" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEibZGjVFhvnmidz1ysuRBCfyCsX4iHZ0UFc2SXGWcikWofhlpV-yw5If6mn2Q-W2gIrGsIcIbeJ_Tj18n-llBfUTznrkZm6KKYsI-tTehkoO09SSV9KppazqOodwHdD1w3THzfCT1tlxRvA/s320/Screen+Shot+2017-04-16+at+8.02.28+PM.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
fig 4: The final pose estimate(<b>error less system</b>)</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Now this is where the concept of probabilistic kinematics and noise kicks in. Suppose the rotational and translational velocities are corrupted by noise with zero mean and variance b. The final velocities, <b>containing real-world noise</b>, are:</div>
<div class="separator" style="clear: both; text-align: left;">
<b><br /></b></div>
<div class="separator" style="clear: both; text-align: center;">
<b><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii_vRQeReex3DXmbApAHc2OQ8LgHnNtaJviSvEhC_SEvkWKzLga3Qm7gbqA-eCrOHHhU37MnDfG6RysjjJ0g7Ck0agFx16ZG77nnnghFRuMv2Dpk2ehcizqsFZ4CpiNWwLRa0V4pRcFl78/s1600/Screen+Shot+2017-04-16+at+8.09.40+PM.png" imageanchor="1"><img border="0" height="76" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii_vRQeReex3DXmbApAHc2OQ8LgHnNtaJviSvEhC_SEvkWKzLga3Qm7gbqA-eCrOHHhU37MnDfG6RysjjJ0g7Ck0agFx16ZG77nnnghFRuMv2Dpk2ehcizqsFZ4CpiNWwLRa0V4pRcFl78/s320/Screen+Shot+2017-04-16+at+8.09.40+PM.png" width="320" /></a></b></div>
<div class="separator" style="clear: both; text-align: center;">
fig 5: Noise into the states</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-12141790871584990642017-04-09T10:33:00.004-07:002017-04-15T01:10:30.408-07:00ROS Autonomous Navigation Bot - [Clerkbot] - Odometry #1<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Odometry is an important aspect of autonomous navigation, though not a strictly essential one: the Hector SLAM algorithm, for instance, does <b>not </b>require odometry. This bot, however, does use it.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Odometry is important because you need to tell the rest of the ROS system where you are with respect to the environment and how far you have moved from the starting point. As a bonus, you also get the heading angle with respect to the initial pose.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Note that the map's origin, the odometry origin, the sensor (lidar) origin, and the geometric axis of rotation of your robot are all different frames. So you are continuously publishing the state of your robot's position, and it is transformed from the local axes to the global axes of the map; the same goes for the other transformations.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
There are nodes already available that take care of your encoder counts. <b>But </b>I haven't used any of them, primarily because I wanted to understand the intricacies of how ROS nodes work. There is a tutorial on odometry too, but it is too subjective and not to the point. Being a non-holonomic robot, this one didn't need much work on the mechanical side.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Here are the odometry tests, in accordance with the sanity checks described <a href="http://wiki.ros.org/navigation/Tutorials/Navigation%20Tuning%20Guide" target="_blank">here</a>.</div>
<div>
<br /></div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/nsbUYb6dZ3Y/0.jpg" src="https://www.youtube.com/embed/nsbUYb6dZ3Y?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com1tag:blogger.com,1999:blog-3046429209072595972.post-66029151843957369602017-04-07T22:58:00.000-07:002017-04-09T19:16:48.095-07:00Key points for Embedded Programming<div dir="ltr" style="text-align: left;" trbidi="on">
There are some intricacies involved when you look at the core assembly level in embedded systems. People who haven't taken a formal embedded course, or any formal C course in general, often end up coding the less efficient way. This is a generalisation, and I too, at some point while coding controllers, have coded the wrong way. The thing is, developing the logic is one part of the story; making the most of the data types, qualifiers, and specifiers is just as important. While coding embedded systems we generally have constraints on flash memory, RAM, or the pins available, so the better you code, the better the optimisation and the better the controller will perform.<br />
<br />
Take a look at this code snippet from my code <a href="https://github.com/kartikmadhira1/STM32F103XX/blob/master/nb/src/main.c" target="_blank">here</a>.<br />
<br />
1) Use of <b>static </b>and <b>const</b><br />
<br />
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisZd6Fv3ERNT1U3z5pS-2M136jINdg_3-nakZo7UGEp-0mZmjGfEm0NEMaupRESKXWR-vadYnsBE8s4HNwN4ryNm74Sc1DLDWiZufAVVVsGEE8Q7lGnQrVotskGhLGIVysJAQqRoRrfa8C/s1600/Screen+Shot+2017-04-08+at+6.29.17+AM.png" imageanchor="1"><img border="0" height="464" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisZd6Fv3ERNT1U3z5pS-2M136jINdg_3-nakZo7UGEp-0mZmjGfEm0NEMaupRESKXWR-vadYnsBE8s4HNwN4ryNm74Sc1DLDWiZufAVVVsGEE8Q7lGnQrVotskGhLGIVysJAQqRoRrfa8C/s640/Screen+Shot+2017-04-08+at+6.29.17+AM.png" width="640" /></a></div>
<div style="text-align: center;">
<div style="text-align: justify;">
<br /></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
This is an interrupt handler for counting encoder states; look at the static variable. The <b>static</b> and <b>const </b>keywords are what I want to focus on here. Some key points:</div>
</div>
<div style="text-align: left;">
</div>
<ul style="text-align: left;">
<li style="text-align: justify;">What's actually happening here is that the variable count must accumulate the encoder counts, leave the function, return to main, and still hold its running total when the handler fires again.</li>
<li style="text-align: justify;"><b>static </b>is used here because I want to keep track of the counts; an ordinary local variable would be destroyed the moment control leaves the function.</li>
<li style="text-align: justify;">The <b>static </b>specifier gives the variable static storage duration: it is allocated once, for the lifetime of the program. This <b>retains </b>the variable's value after the function's scope ends and prevents a fresh copy being created on every call.</li>
<li style="text-align: justify;"><b>const </b>is a qualifier and it tells the compiler not to make any changes to the variable.</li>
<li style="text-align: justify;">A <b>global static </b>variable is restricted to the scope of its file (internal linkage), meaning no other file can access the same variable.</li>
</ul>
<div>
2) Use of <b>volatile</b><br />
<b><br /></b>
<br />
<div style="text-align: justify;">
This is another grey area in embedded. Not many programmers know of it, and fewer use it, at least among amateur embedded programmers. <b>volatile </b>is a qualifier, not a storage specifier like <b>static. </b>The compiler, while converting the high-level language to machine code, keeps making optimisations so the code runs faster and more efficiently.</div>
<div style="text-align: justify;">
So by adding a <b>volatile </b>qualifier you are telling the compiler not to optimise away accesses to that variable, because its value may change outside the compiler's view (for example, in an ISR or a hardware register). Here's a good <a href="http://www.geeksforgeeks.org/understanding-volatile-qualifier-c-set-1-introduction/" target="_blank">example</a>.</div>
<br />
3) Use of <b>pointers</b><br />
<div style="text-align: justify;">
<b><br /></b></div>
<div style="text-align: justify;">
Again, a topic largely neglected by intermediate and amateur embedded programmers. Basic logic is enough to get the code running, but optimisation and controller efficiency matter too. I'm not going to delve into details, but pointers hold addresses directly. This is important because much of the time you are writing some driver and need to address a memory location directly.</div>
</div>
<div>
<div style="text-align: justify;">
There's also the fact that pointers avoid having the program make copies of the variable at run time.</div>
</div>
<div>
<br />
4) Use of <b>debuggers</b><br />
<br />
<div style="text-align: justify;">
When you are building large projects, you can't rely on trial and error every time to find the bugs in your code. You need a good debugger to see what happens to your code line by line.</div>
<div style="text-align: justify;">
It also means that if you have peripherals attached to your controller, you can actually check whether the incoming data is correct. I use a <b>gdb </b>server setup to load and debug via the ST-Link on the STM32.<br />
<br /></div>
</div>
<div>
A short video of which you can see here:<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/7dBRHsrSSk8/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/7dBRHsrSSk8?feature=player_embedded" width="320"></iframe></div>
<br />
<br /></div>
<div>
<br /></div>
<br />
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-26497139562087675662017-03-05T09:13:00.002-08:002017-03-05T16:00:39.282-08:00ROS Autonomous Navigation Bot - [Clerkbot] - Initial Tests<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
It finally took me three months to fully bring up this robot, and fun fact: a whole month of that went into tuning the ocean of parameters. The build itself took about two months, though a good three months went into 'learning' the ROS framework, including Gazebo simulations of a UAV and a UGV. Now that it is done, we are all geared up for the challenge of autonomous UAV navigation.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
This whole robot setup is part of a year-long research project on UAV and UGV platforms under Dr. D.K. Kothari, HOD, EC Department, Nirma University. The UGV was planned as a precursor to the UAV, since we wanted to get hands-on with the ROS framework first: jumping straight to a UAV can be a daunting task, and the UGV was commissioned to fulfil that exact need. The UGV is working perfectly. As for why exactly it is named Clerkbot, you'll have to wait.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
This is going to be a series of tutorial-cum-documentation posts on the Clerkbot. In the coming videos and posts I will cover all the details of the robot. I also plan to open-source the whole setup soon so that others can benefit from it.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Here's the initial tests video of the UGV:</div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/E0TS8oXNmkw/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/E0TS8oXNmkw?feature=player_embedded" width="320"></iframe></div>
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0Sarkhej-Gandhinagar Highway, Chandlodia, Gota, Ahmedabad, Gujarat 382481, India23.1284032 72.543016299999977-28.9378168 -10.074171200000023 75.1946232 155.16020379999998tag:blogger.com,1999:blog-3046429209072595972.post-75783233316079219212017-01-05T08:03:00.000-08:002017-04-07T17:02:30.665-07:00STM32 Cortex M3 Series - [STM32F103C8T6] - #2 - Encoder Interface<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Before I begin, kindly keep my <a href="https://github.com/kartikmadhira1/STM32F103xx/blob/master/enc_4x_precision/src/main.c" target="_blank">GitHub</a> code as a reference throughout the tutorial. I'm currently working on an encoder interface to generate odometry data for my autonomous UGV and UAV project. This can be done in various ways with encoders of various types; the one I have is a 300 RPM motor with quadrature encoder circuitry mounted on its shaft. <a href="http://robokits.co.in/motors/encoder-dc-servo/high-precision-quad-encoder-geared-dc-motor-12v-300rpm" target="_blank">Here's</a> the motor I'm talking about, from Robokits:<br />
<br /></div>
<div style="text-align: justify;">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSB6dapzc2SZcmwySpB8RSHorfxodnQVM7HPzNUW_CgJhYXLud6ZroWu1FQyM-7MtHK_vTIycj_m4dYbNNtUW9dbS7ClADjV0FbrtLXHegJMIZlztGR-CpFwVx5yiZtezHTcLFFaAX8BfA/s1600/WhatsApp+Image+2017-01-05+at+9.12.15+PM.jpeg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSB6dapzc2SZcmwySpB8RSHorfxodnQVM7HPzNUW_CgJhYXLud6ZroWu1FQyM-7MtHK_vTIycj_m4dYbNNtUW9dbS7ClADjV0FbrtLXHegJMIZlztGR-CpFwVx5yiZtezHTcLFFaAX8BfA/s320/WhatsApp+Image+2017-01-05+at+9.12.15+PM.jpeg" width="240" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<i><br /></i></div>
<div class="separator" style="clear: both; text-align: center;">
<i>Figure 1. The Quadrature Encoder Motor</i></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
The encoder circuitry takes +5 V and GND as logic inputs and has two channel outputs, A and B, which produce phase-shifted square waves.</div>
<div style="text-align: justify;">
(<b>NOTE: </b>For lack of a datasheet on the website, I spent around two full days trying to figure out why no outputs were coming, only to realise later that the pins need to be pulled high with a 10k resistor.)</div>
<div style="text-align: justify;">
What seems easy on paper is to simply take interrupts from both channels for higher resolution, put conditional checks in the interrupt routines to work out the direction, and send the count back to the main loop.</div>
<div style="text-align: justify;">
<b>WRONG. Never do this. This is bad coding. </b>The prominent reasons for this are:</div>
<br />
<ol style="text-align: left;">
<li style="text-align: justify;">The encoder readings change very fast. The code might skip some values or catch an error reading (channel_A_state = channel_B_state = 0 or 1), so the count can end up wrong or illegal.</li>
<li style="text-align: justify;">Checking for a high or low value inside an Interrupt Service Routine is a bad choice too. ISRs should be as short as possible: the core has been interrupted and needs to get back to its other tasks.</li>
</ol>
<br />
<div style="text-align: justify;">
I would like to point you towards <a href="http://electronics.stackexchange.com/questions/99915/stm32-rotary-encoder-with-hardware-interrupts" target="_blank">this</a> Stack Exchange answer, which argues against using conditionals when processing the interrupt values:</div>
<blockquote class="tr_bq">
<blockquote class="tr_bq" style="background-color: white; border: 0px; clear: both; color: #242729; font-family: Verdana, "Bitstream Vera Sans", "DejaVu Sans", Tahoma, Geneva, Arial, sans-serif; font-size: 15px; line-height: 19.5px; margin-bottom: 1em; padding: 0px; text-align: justify;">
You should have ZERO ifs. No decisions. Store your AB state, i.e., 00 or 01, then append your next state, i.e 0001 means AB went from 00 to 01,thus B changed from 0 to 1. Make this a +1. If starting from 00, and you change to 10, then call this a -1. Build a 16 element array of all possible transitions holding the number that needs to be added to your count if it occurs, noting that some are illegal and need to be handled.</blockquote>
</blockquote>
The counting essentially works on a Gray code: we build a state from the previous and current AB readings and check it against an array of legal and illegal transitions. Here's the algorithm:<br />
<br />
<ol style="text-align: left;">
<li>Create an array of the possible states in the graycode with something like this:<br /><br /><div style="text-align: left;">
int8_t states[] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0}</div>
</li>
<li><div style="text-align: left;">
Also build the combined state (final_state = (prev_AB_state &lt;&lt; 2) | current_AB_state): left-shift the previous AB state by 2 and append the current AB state. (If this is confusing to you, see the full code on my git.)</div>
</li>
<li><div style="text-align: left;">
Here's how a graycode looks:</div>
</li>
</ol>
<br />
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdrAyudtCVl-Jsgo0GP4ycg6dN-Qxigg8OElWNNh_WNRCLx9T1bDOYpsdtNgZrso0P6X56oDw8PCefw8rDG5ip2rV8ShThCMnvJbj0e9SoGqYU1tzcsGmXjToEx0v6AGLV64tR8ZQgAl5_/s1600/384.png" imageanchor="1"><img border="0" height="294" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdrAyudtCVl-Jsgo0GP4ycg6dN-Qxigg8OElWNNh_WNRCLx9T1bDOYpsdtNgZrso0P6X56oDw8PCefw8rDG5ip2rV8ShThCMnvJbj0e9SoGqYU1tzcsGmXjToEx0v6AGLV64tR8ZQgAl5_/s320/384.png" width="320" /></a><br />
<div style="text-align: left;">
<br />
For example,<br />
<ul style="text-align: left;">
<li>If final_state is 0001, prev_AB_state = 00 has changed to 01. This is assigned a legal value of -1, i.e. <b>anti-clockwise rotation</b>.</li>
<li>If final_state is 0010, prev_AB_state = 00 has changed to 10. This is assigned a legal value of +1, i.e. <b>clockwise rotation.</b></li>
<li>If both channels A and B change at the same time, it is an <b>illegal state</b>, since both cannot change together.</li>
</ul>
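The steps above boil down to one table lookup per AB sample, with no ifs in the counting path. Here's a small host-side Python sketch of the same lookup-table idea (the on-target version is the C code on my git; the function name and sample format here are just for illustration):

```python
# Lookup-table quadrature decoder: index = (prev_AB << 2) | curr_AB.
# Entries mirror the int8_t states[] array above; 0 marks the illegal
# transitions where both channels change at once.
STATES = [0, -1, 1, 0,
          1, 0, 0, -1,
          -1, 0, 0, 1,
          0, 1, -1, 0]

def decode(samples):
    """Accumulate a position count from successive AB samples (0..3)."""
    count = 0
    prev = samples[0]
    for curr in samples[1:]:
        count += STATES[(prev << 2) | curr]  # one lookup, no branching
        prev = curr
    return count

# One full clockwise cycle: 00 -> 10 -> 11 -> 01 -> 00
print(decode([0b00, 0b10, 0b11, 0b01, 0b00]))  # 4
# One full anti-clockwise cycle: 00 -> 01 -> 11 -> 10 -> 00
print(decode([0b00, 0b01, 0b11, 0b10, 0b00]))  # -4
```

Each legal step contributes +1 or -1, so a full four-step cycle changes the count by four in the corresponding direction.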
I will soon post a video of this in action, and my next post will be on setting up OpenOCD debugging and the Eclipse CDT setup on Linux.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Cheers!<br />
<br /></div>
</div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-42067337302175888552016-12-14T10:38:00.002-08:002016-12-14T10:38:18.591-08:00The Low Cost Automatic 'Wall-Painter' Robot - A Sneak Peek! <div dir="ltr" style="text-align: left;" trbidi="on">
After nearly 3 months of working on and designing the 'Wall-Painter' idea (I proposed it to the Idea Labs Committee, Nirma University, about 3 months ago) along with team members Dishant Kavathia and Ammar Gandhi, here is a sneak peek. The objective of the Wall-Painter is simple: a dedicated app will provide templates of paint designs as input, and the Wall-Painter will oblige and do the rest. I have been doing projects for the last two years, but they always lacked a degree of professional 'furnishing' and 'design'. Hence, the goal this time was to give ample time to designing and refining the project at hand.<br />
<br />
The idea first needed to be turned from a concept in my mind into a proper 3D model. Dishant Kavathia, a fellow teammate, has been amazing at getting the model right in SolidWorks and rectifying the underlying mechanical difficulties. Ammar and I will be looking after the electronic installation and the algorithms that drive the tasks. Also, I can't indulge in any more details, as this is under development.<br />
<br />
Here are a few snapshots of the proposed plan as designed by us (model made by Dishant Kavathia):<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg74QWUucmRdn0aKgfaIEilaubysKxg9Bj9nWcVsD8jUHqDcLL5Ip9Zp_cbVu9vQgraAUC5oddREqLXh2WD4ktCKBvrkgues56bpH7304lBOoji2AdVTHpElNF-j23FweWs4tTLp2d5p1qC/s1600/base.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="165" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg74QWUucmRdn0aKgfaIEilaubysKxg9Bj9nWcVsD8jUHqDcLL5Ip9Zp_cbVu9vQgraAUC5oddREqLXh2WD4ktCKBvrkgues56bpH7304lBOoji2AdVTHpElNF-j23FweWs4tTLp2d5p1qC/s320/base.png" width="320" /></a></div>
<br />
<div style="text-align: center;">
<i>Figure 1. The skeleton model</i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg68TXfing8w-Rq-L4qARN4C71Zy5xrfbBqunDaqQcWNWtSOfBf9Su6lxLys_3XUBsqPfErAvquILm0D0WJu21rpCgO8LX1-EFd7YqSl8QPoX5EnBf6USbZHpJHxnRcsF4hGHPmKjuKUcJi/s1600/frame3.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="157" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg68TXfing8w-Rq-L4qARN4C71Zy5xrfbBqunDaqQcWNWtSOfBf9Su6lxLys_3XUBsqPfErAvquILm0D0WJu21rpCgO8LX1-EFd7YqSl8QPoX5EnBf6USbZHpJHxnRcsF4hGHPmKjuKUcJi/s320/frame3.png" width="320" /></a></div>
<i></i><div style="text-align: center;">
<i><br /></i></div>
<div style="text-align: center;">
<i>Figure 2. The basic X-Y frame</i></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-61329527804422651262016-12-14T09:55:00.001-08:002016-12-14T09:55:48.817-08:00STM32 Cortex M3 Series - [STM32F103C8T6] - #1 - LED Blinking <div dir="ltr" style="text-align: left;" trbidi="on">
So finally, after about 3-4 months of working with these chips, here I am with one of the first posts in the ARM STM32 Cortex-M3 series. The 8-bit controllers seem somewhat plain in comparison to the powerful 32-bit ARM architecture, which boasts a wealth of peripherals: DMAs, USB OTG, Ethernet and a large number of GPIOs (obviously, I still haven't touched most of them and am in my learning process). It becomes a different ball game, though, since the application you need it for may not require this much computational prowess. Let's start with the very 'Hello World' of every hardware project, which is LED blinking:<br /><br />
The IDE I've been using is CooCox, a very decent IDE that bundles the ARM toolchain and has a good amount of community sample code online. There are two ways we can program using this IDE:<br /><div>
<ol style="text-align: left;">
<li>Using the existing, well-documented peripheral libraries for the STM32F103xx series, which is what I'm using currently; a blog on the subject has been helpful to me in programming this series.</li>
<li>Secondly, one could use low-level, register-based programming using just the basic ST headers, which is well documented in the YouTube tutorials by <a href="https://www.youtube.com/watch?v=R6SstBoXjKc&list=PL6PplMTH29SHgRPDufZhfMRoFwRAIrzOp" target="_blank">Patrick Hood-Daniel</a>.</li>
</ol>
The final goal is to light up an LED, so we'll need to set up the GPIOs, i.e. the port used for the LED. Before we begin, it will be easier if you have the <a href="http://www.st.com/content/ccc/resource/technical/document/reference_manual/59/b9/ba/7f/11/af/43/d5/CD00171190.pdf/files/CD00171190.pdf/jcr:content/translations/en.CD00171190.pdf" target="_blank">reference manual</a> (and not the datasheet) open in the background.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhid1UUU74yDhdaIt6sOHqQSeJQw10TYn0iSndrWRKFVZw6iICJzTnYS-kqja19nznxdG1FNJMIEn4fbL6bGnaGAZUcyCrjYFI4hqm9jfIfmNIhtmO0Njt8kjBseljGsJVtI416EOKs5EuO/s1600/push.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="128" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhid1UUU74yDhdaIt6sOHqQSeJQw10TYn0iSndrWRKFVZw6iICJzTnYS-kqja19nznxdG1FNJMIEn4fbL6bGnaGAZUcyCrjYFI4hqm9jfIfmNIhtmO0Njt8kjBseljGsJVtI416EOKs5EuO/s320/push.png" width="320" /></a></div>
<ol style="text-align: left;">
<li>The Cortex-M3 core is connected to its peripherals via buses that are prominent parts of the ARM architecture: the AHB, or Advanced High-performance Bus, which is in turn connected to the APBs (Advanced Peripheral Buses). From the datasheet diagram below:<br /><br /><div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCNXsBf-0LoOgwe3apIUnhE06fMW6fClrhr7EADwMpJxDaWBHKwlH-Yik8CqdsGdzhBcMBYouXpLFeMKkjT9co9cDCI2OpDeq_0-uPYAdvbLKpOKCXTyFD5XlxVwNEcZRd17mkFhEYK43q/s1600/apb2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="387" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCNXsBf-0LoOgwe3apIUnhE06fMW6fClrhr7EADwMpJxDaWBHKwlH-Yik8CqdsGdzhBcMBYouXpLFeMKkjT9co9cDCI2OpDeq_0-uPYAdvbLKpOKCXTyFD5XlxVwNEcZRd17mkFhEYK43q/s640/apb2.png" width="640" /></a></div>
<div style="text-align: center;">
<span style="font-size: x-small;"><i><b><span style="font-size: small;">Figure 1. Architecture diagram, AHB and APB2 buses</span></b></i></span><br />
<br /></div>
We can see that the GPIOs are clocked from the APB2 bus, which runs at a maximum of 72 MHz. So our first task will be to enable the clock source for these GPIOs. </li>
<li>Next, since we are going to use the GPIOs, we need to specify how the port or pin should behave: input or output, and the type of input or output, as with every microcontroller.<br />The following input and output types are available on the GPIOs, as given in the reference manual (p. 158, section 9.1):</li>
</ol>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGDqZasb7YID1iuyEJQIp7CV4tRmJiwOaefX7u0KjkLzKr3U4kCjZLMP08xJL_38Uze_QFHQhPnoWkfpE5UK3E14ZiC1OcZpEV4XpoBNepK730WSvW73TElQzms3Db8kSWbXRWAeAgcMMn/s1600/push.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="256" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGDqZasb7YID1iuyEJQIp7CV4tRmJiwOaefX7u0KjkLzKr3U4kCjZLMP08xJL_38Uze_QFHQhPnoWkfpE5UK3E14ZiC1OcZpEV4XpoBNepK730WSvW73TElQzms3Db8kSWbXRWAeAgcMMn/s640/push.png" width="640" /></a></div>
<div style="text-align: center;">
<i><b>Figure 2. Input and output pin configurations</b></i></div>
<br />
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqiuoR5LbUsdZ49Q9WNYuQsT4_gTdZW_18wtJA1AmRl5O3l3eJ0jgmftHrAxhuMOSB8VS-4lhgfnpQ3pWI1cwWATbKwhgYfwn_JSwBgEXOY2Swuv38EmFUmqZigw5rrNkLrnCwxIdLnuXC/s1600/gpio.png" imageanchor="1"><img border="0" height="374" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqiuoR5LbUsdZ49Q9WNYuQsT4_gTdZW_18wtJA1AmRl5O3l3eJ0jgmftHrAxhuMOSB8VS-4lhgfnpQ3pWI1cwWATbKwhgYfwn_JSwBgEXOY2Swuv38EmFUmqZigw5rrNkLrnCwxIdLnuXC/s640/gpio.png" width="640" /></a></div>
<div style="text-align: center;">
<b><i>Figure 3. Basic input and output block Structure of the GPIO's </i></b></div>
<div style="text-align: left;">
<b><i><br /></i></b></div>
<div style="text-align: left;">
Here's a brief on the types of these pins:</div>
<div style="text-align: left;">
</div>
<ul style="text-align: left;">
<li><b>Input floating : </b>The pin is left in a 'floating' (tri-state) condition, so the signal level is undefined unless the pin is driven externally or pulled to a definite state.</li>
<li><b>Input pull-up : </b>The pin reads high when nothing is connected to it, since an internal resistor pulls it up.</li>
<li><b>Input pull-down : </b>Same as above, but the pin is pulled down by an internal high-value resistor when nothing is connected.</li>
<li><b>Output Open-drain : </b>The output transistor is open-drain, so an external pull-up resistor tied to a voltage rail is needed at the drain.</li>
<li><b>Output push pull : </b>Unlike open-drain, where the pin cannot source current on its own and must rely on the pull-up resistor to go high, a push-pull output has two transistors, one to source and one to sink current. Here is a good explanation from <a href="http://www.edaboard.com/thread97365.html" target="_blank">this</a> site:<br />"The push-pull output actually uses two transistors. Each will be on to drive the output to the appropriate level: the top transistor will be on when the output has to be driven high and the bottom transistor will turn on when the output has to go low.<br />The advantage of the push-pull output is the higher speed, because the line is driven both ways. With the pullup the line can only rise as fast as the RC time constant allows. The R is the pullup, the C is the parasitic capacitance, including the pin capacitance and the board capacitance.<br />The push-pull can typically source more current. With the open-drain the current is limited by the R and R cannot be made very small, because the lower transistor has to sink that current when the output is low; that means higher power consumption." </li>
</ul>
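To make the pin-configuration step concrete: each pin is configured by a 4-bit field, MODE[1:0] in the low two bits and CNF[1:0] in the high two bits, in GPIOx_CRL (pins 0-7) or GPIOx_CRH (pins 8-15). The bit arithmetic is sketched below in Python purely to illustrate the register layout; the helper name is made up, and the actual C register writes are what the next part will cover:

```python
# Each STM32F103 GPIO pin has a 4-bit configuration field in GPIOx_CRL
# (pins 0-7) or GPIOx_CRH (pins 8-15): MODE[1:0] in the low two bits,
# CNF[1:0] in the high two bits (reference manual, section 9.2).
def crl_crh_field(pin, mode, cnf):
    """Return (register_name, bit_shift, 4-bit field value) for one pin."""
    reg = "CRL" if pin < 8 else "CRH"
    shift = (pin % 8) * 4          # position of this pin's 4-bit field
    return reg, shift, (cnf << 2) | mode

# PC13 (the on-board LED on many F103 boards) as a general-purpose
# push-pull output, max 2 MHz: MODE = 0b10, CNF = 0b00.
print(crl_crh_field(13, 0b10, 0b00))  # ('CRH', 20, 2)
```

In C, the resulting nibble would be masked and OR-ed into GPIOC-&gt;CRH at the returned shift.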
I'll continue with the coding in the next part, as this post would otherwise become very long.<br />
Thanks for reading, and cheers!<br />
<br />
<br />
<br />
</div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com2tag:blogger.com,1999:blog-3046429209072595972.post-89853798779453299892016-08-25T12:23:00.002-07:002016-08-26T09:17:30.330-07:00 RPLiDAR + ROS + Hectorslam Setup<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Lately, I have been working on a very interesting but difficult project with team UAV Nirma, 'UAV Navigation Indoors', with a timeline running to August 2017. </div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
The first thing you might ask is: why such a costly and complicated setup?</div>
<div style="text-align: justify;">
The thing is, it is not as complicated as you might think. The specific algorithm we want to employ is SLAM (Simultaneous Localization and Mapping).</div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
SLAM maps the environment, extracts landmarks, and uses those landmarks to determine where the robot is. There is very good documentation available <a href="http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/projects/1aslam_blas_repo.pdf" target="_blank">here.</a></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
Since the quad has to know where it is with respect to the landmarks, we have to use a lidar, and to use the lidar, ROS.</div><div style="text-align: justify;"><br></div><div style="text-align: justify;">The lidar we're using is the RPLIDAR from RoboPeak. </div><div style="text-align: justify;"><br></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
For those looking into how to interface the RPLIDAR with ROS, here's a very good <a href="https://hollyqood.wordpress.com/2015/12/01/ros-slam-2-hector-slam-2d%E5%9C%B0%E5%9C%96%E5%BB%BA%E7%BD%AE/" target="_blank">tutorial.</a></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
Here's a video of Hector SLAM in action:</div>
<div style="text-align: justify;">
<br></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/5Sa63Ter-a8/0.jpg" src="https://www.youtube.com/embed/5Sa63Ter-a8?feature=player_embedded" frameborder="0" allowfullscreen=""></iframe></div>
<br></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-13638557779110534292016-08-01T16:12:00.003-07:002016-08-25T12:50:50.027-07:00People Counter using OpenCV and Python<div dir="ltr" style="text-align: left;" trbidi="on">
A people counter is a counting system based on image-processing techniques, used to count human traffic at places such as retail shops, malls and other public spaces. The module was built on an RTSP stream from an overhead CCTV camera, processed with the OpenCV library for Python. This module was made during my internship. <br />
<br />
<br />
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidRVM-zAxg0r7OWgGhfqX6C_yKiNj-rTCdYdV_Awl91lEe5yjSlSvyJkWusfk6OgwojGUV7nPFROdnpeRr6h4-qTbKqe2w6eOa1TFAnk60N3GzUBSyImFMMSUQki0JgYBcZvadXaKxzBZO/s1600/Screenshot+from+2016-08-02+04%253A31%253A43.png" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidRVM-zAxg0r7OWgGhfqX6C_yKiNj-rTCdYdV_Awl91lEe5yjSlSvyJkWusfk6OgwojGUV7nPFROdnpeRr6h4-qTbKqe2w6eOa1TFAnk60N3GzUBSyImFMMSUQki0JgYBcZvadXaKxzBZO/s320/Screenshot+from+2016-08-02+04%253A31%253A43.png" width="320" /></a> </div>
<br />
<br />
<br />
<div style="text-align: center;">
</div>
<div style="text-align: center;">
<i>Fig. 1. Snapshot of the people counter GUI</i> </div>
<br />
<br />
<div style="text-align: center;">
<br /></div>
The live video feed is processed frame by frame: morphological operations are applied to each frame to get rid of noise and obtain clean blobs. Next, using the contours method, we extract the coordinates of each blob and track them so that each blob can be counted as it passes through a region of the frame. The counting accuracy depends on a number of factors, such as camera height, field of view, camera angle, lighting, and so on.<br />
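The counting step itself reduces to a crossing test on the tracked centroids. Here's a minimal sketch of just that logic in plain Python; the blob extraction (OpenCV morphology and contours) is assumed to happen upstream, and the function name, track format and line position are illustrative rather than the production code:

```python
# Counting logic only: each tracked blob's centroid y-coordinate is
# compared against a horizontal counting line between frames.
# (Blob extraction via OpenCV morphology/contours happens upstream.)
LINE_Y = 240  # hypothetical counting line for a 640x480 frame

def count_crossings(tracks):
    """tracks: dict of blob_id -> list of (x, y) centroids per frame."""
    entered = exited = 0
    for path in tracks.values():
        for (x0, y0), (x1, y1) in zip(path, path[1:]):
            if y0 < LINE_Y <= y1:    # moved downward across the line
                entered += 1
            elif y1 < LINE_Y <= y0:  # moved upward across the line
                exited += 1
    return entered, exited

print(count_crossings({1: [(100, 200), (102, 250)],
                       2: [(300, 300), (298, 230)]}))  # (1, 1)
```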
<br />
Take a look at the video:<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/Ub3rOGQpZUI/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/Ub3rOGQpZUI?feature=player_embedded" width="320"></iframe></div>
<div style="text-align: center;">
<br /></div>
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com24tag:blogger.com,1999:blog-3046429209072595972.post-74622750813752207232016-06-26T16:44:00.001-07:002016-06-27T00:31:35.313-07:00Glance time/Attention time span using OpenCV and python<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
This is the first of the many modules that I have made with my team at <a href="http://www.tethr.it/" target="_blank">TetherBox Technologies</a>, an awesome IoT startup in the heart of Bangalore!</div>
<div style="text-align: justify;">
My colleague and I have been building the people counter system since April 2016; before I post about that next month, I thought I'd write about the glance time, or face attention span, module I made here. It is meant for companies that want insight into customers looking at their products, specifically their attention span for the most important products on display.</div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">I have simply used the Haar cascade frontal-face classifier and the Python time module. </div>
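The attention-span measurement is simple bookkeeping with timestamps: start timing when a face appears and accumulate when it disappears. A minimal sketch of that logic, with the Haar-cascade detection replaced by injected booleans so it stands alone (names are illustrative):

```python
# Attention-span timing only: accumulate how long a face stays in view.
# Face presence per frame (normally from a Haar cascade) is injected
# here as booleans so the timing logic can be shown on its own.
def glance_time(frames):
    """frames: list of (timestamp_seconds, face_present) samples."""
    total, start = 0.0, None
    for t, present in frames:
        if present and start is None:
            start = t                 # face just appeared
        elif not present and start is not None:
            total += t - start        # face just left
            start = None
    if start is not None:             # face still visible at the end
        total += frames[-1][0] - start
    return total

samples = [(0.0, False), (1.0, True), (2.0, True),
           (3.5, False), (4.0, True), (5.0, True)]
print(glance_time(samples))  # 3.5
```

In the real module, `time.time()` would supply the timestamps and the cascade's detections the booleans.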
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
Here's the video:</div>
<br>
<br>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/WoIHQOIXDyk/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/WoIHQOIXDyk?feature=player_embedded" width="320"></iframe></div>
<div style="text-align: center;">
<br></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com2Bengaluru, Karnataka 560001, India12.9715987 77.59456269999998312.4764182 76.949115699999979 13.4667792 78.240009699999987tag:blogger.com,1999:blog-3046429209072595972.post-58719557744810267572016-06-13T13:33:00.001-07:002016-06-13T22:47:52.761-07:00Twin Motored RC plane-Toothless, by Team Avalon!<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Hello guys, no posts in a long time: we (Team Avalon) had been working on an RC plane since January, preparing for SAE Aeromodelling in Chennai, where we won the Best Design award.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
I'll keep this short, with the specs followed by the flight video. Specs:</div>
1. 1000kv Brushless motors AVIONICS x2<br />
<div style="text-align: justify;">
<a href="http://www.rcbazaar.com/products/2416-avionic-c283012-kv1000-brushless-motor.aspx" target="_blank">See Motors here</a></div>
<div style="text-align: justify;">
</div>
<div style="text-align: justify;">
2. 30 Amps ESC's AVIONICS x2</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<a href="http://www.rcbazaar.com/product.aspx?productid=2445" target="_blank">See ESC's here</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
3. 2200 mAh Li-Po Battery 25 C burst rating</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
4. 9gm AVIONICS servos x4</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
5. Turnigy 9x Receiver and Transmitter</div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimrAP4Rdlom4hE02TXRvwDVK_WUuPiM1nG9rlaPp3Q-sCXOpbG_A-c0OOdriNKLO_zS8XpGO4YtWldMmuHLLOspr6g04YjNCDCEFPYyA9bH2NZCO-9KLarmqB-lFtYDfcspVmXSB0RphHA/s1600/f.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimrAP4Rdlom4hE02TXRvwDVK_WUuPiM1nG9rlaPp3Q-sCXOpbG_A-c0OOdriNKLO_zS8XpGO4YtWldMmuHLLOspr6g04YjNCDCEFPYyA9bH2NZCO-9KLarmqB-lFtYDfcspVmXSB0RphHA/s320/f.jpg" width="320" /></a></div>
<div style="text-align: center;">
<i>Toothless</i></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXOaGlYgXIQlfG_LCqBhIebC7H-DUXlk77Dp8cYqshke4FDmOFlfegqHWCzPsEMym7IKIyN9c7i2f79Z9GgpeTPL3n5Ob74MufF2jL-a4jii1K5H8tkyRuwVhDYWswoe1rBk5P8df4v8h_/s1600/j.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXOaGlYgXIQlfG_LCqBhIebC7H-DUXlk77Dp8cYqshke4FDmOFlfegqHWCzPsEMym7IKIyN9c7i2f79Z9GgpeTPL3n5Ob74MufF2jL-a4jii1K5H8tkyRuwVhDYWswoe1rBk5P8df4v8h_/s320/j.jpg" width="320" /></a></div>
<div style="text-align: center;">
<i>Top View</i></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0WDtC4U1gwjn-UTtW9N09rjqnfg4EyafxxEEujvGEyZSPrjj-VrqzVBCafNvuPtwyfY5-MPmXB4Ukz8OsF2Sfd6qd7-Q-NW4GCOUMAYe4IBgOaM2jOQpoZzlmKKjjyanYIQbZJWK2UVeG/s1600/IMG-20160313-WA0000.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="199" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0WDtC4U1gwjn-UTtW9N09rjqnfg4EyafxxEEujvGEyZSPrjj-VrqzVBCafNvuPtwyfY5-MPmXB4Ukz8OsF2Sfd6qd7-Q-NW4GCOUMAYe4IBgOaM2jOQpoZzlmKKjjyanYIQbZJWK2UVeG/s320/IMG-20160313-WA0000.jpg" width="320" /></a></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<i>Team Logo</i></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5SpXHpBgY1S3lQvTQ6PM5uvbCnrEnNrH_TvavTpTIRC_vBWHr9IYT9Ji1Rwe0XbMGe_fzJevbEE71oIMQ24fOtuIzXuWegI0s36PnSkl-ybBzSHh6lw2ftEE-SmPUnjybUYevRYWWgO00/s1600/team.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5SpXHpBgY1S3lQvTQ6PM5uvbCnrEnNrH_TvavTpTIRC_vBWHr9IYT9Ji1Rwe0XbMGe_fzJevbEE71oIMQ24fOtuIzXuWegI0s36PnSkl-ybBzSHh6lw2ftEE-SmPUnjybUYevRYWWgO00/s320/team.jpg" width="320" /></a></div>
<div style="text-align: center;">
<i>The proud Team</i></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<i>Watch the first flight! (Test Version)</i></div>
<div style="text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/lJsIa8GX2D4/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/lJsIa8GX2D4?feature=player_embedded" width="320"></iframe></div>
<div style="text-align: center;">
</div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com2tag:blogger.com,1999:blog-3046429209072595972.post-69232057639330711462016-04-29T08:12:00.000-07:002016-06-13T13:35:12.642-07:00Guide to Installing OpenCV+python setup on Windows [Video .avi/.mp4 problems Solved]<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Hello everyone. This is my first post in a long time, and this time I've come up with something that got on my nerves while I was installing it. The reason for creating this post was mainly the unavailability of a proper installation procedure (shocking!). This library isn't like those '.exe' programs where a single installation gives you a running image-processing tool. If you're new to OpenCV, no problem, I've got you covered.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<u><b>1) Install Python 2.7</b></u>:</div>
<div style="text-align: justify;">
Download the Python 2.7 package from the official Python site, choosing the build that matches your system configuration. Download and install it from <a href="https://www.python.org/download/releases/2.7/" target="_blank">here</a>.</div>
<div style="text-align: justify;">
Note: Install the Windows X86-64 MSI Installer (2.7.0) package if you're on a 64-bit version of Windows. </div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: center;">
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5kPOQ4g4JkGKI6XSRiSD2tEChZ2A8mI0AwzRpp4tlw4QBQcVZ0EcJHEEhn-J_CxgMLXdvujLFcv3ZFpEDOD6W3JIKwYFXRq7wy-WXS43My8wWvVEPLDhl7d1bo99OM42Zoqh6ztN7riAh/s1600/Untitled.jpg" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5kPOQ4g4JkGKI6XSRiSD2tEChZ2A8mI0AwzRpp4tlw4QBQcVZ0EcJHEEhn-J_CxgMLXdvujLFcv3ZFpEDOD6W3JIKwYFXRq7wy-WXS43My8wWvVEPLDhl7d1bo99OM42Zoqh6ztN7riAh/s640/Untitled.jpg" width="640" /></a></div>
</div>
<div style="text-align: center;">
<div style="text-align: justify;">
<br /></div>
</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: left;">
<div style="text-align: justify;">
<u><b>2) Download the latest stable OpenCV version (3.0):</b></u></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
This step downloads the library as an archive containing all the dependencies needed to run OpenCV successfully. Here's the link to the download page: <a href="http://sourceforge.net/projects/opencvlibrary/files/opencv-win/3.0.0/opencv-3.0.0.exe/download" target="_blank">Link</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Next, extract the package to the desktop; the extracted folder will be named 'opencv'.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBfbR8nVheqBgsNrNEIsAWXc8nL0OxgfgroHDahOXJ5kXriRYMJuldW22Bn1xFilrwq_Squ57lz0EU5JJT_XT7YHxAyU7idXzrP066jZW8EZ8hsiJb1S7dxM0la_nfGBeoaDQ7hiq2MoB9/s1600/Untitled.jpg" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBfbR8nVheqBgsNrNEIsAWXc8nL0OxgfgroHDahOXJ5kXriRYMJuldW22Bn1xFilrwq_Squ57lz0EU5JJT_XT7YHxAyU7idXzrP066jZW8EZ8hsiJb1S7dxM0la_nfGBeoaDQ7hiq2MoB9/s640/Untitled.jpg" width="640" /></a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgl4NZTMspBJARBqn4iL-M2VsMf2AvmhDdz3hbuNyQlkSEvQLOwvLJnpCo37Ub5sebvFS0gXTLOmfLj49lwLJ5wiEIcqLhA1gBfi_cU7cIKaeGwsTOocUcpbvuqfEl6_iIxAT47gonU5med/s1600/Untitled.jpg" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgl4NZTMspBJARBqn4iL-M2VsMf2AvmhDdz3hbuNyQlkSEvQLOwvLJnpCo37Ub5sebvFS0gXTLOmfLj49lwLJ5wiEIcqLhA1gBfi_cU7cIKaeGwsTOocUcpbvuqfEl6_iIxAT47gonU5med/s640/Untitled.jpg" width="640" /></a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: center;">
<div style="text-align: justify;">
<b><i>extracting the opencv package</i></b></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
<b><i><br /></i></b></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
<b><i><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOqTmXkaNwXHYtXNgq9a6e33nSaD-AsWVWPtL8r6JNCvLfpjje50AXcbhFuWDIX9boVszeUmJ4HPizQu5_pHV7DbkUYhvtiWNhIX3fhEIIeUL0kIbRYesVtiyPGl0gh9_xrzUu7-KRCJXw/s1600/Untitled.jpg" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOqTmXkaNwXHYtXNgq9a6e33nSaD-AsWVWPtL8r6JNCvLfpjje50AXcbhFuWDIX9boVszeUmJ4HPizQu5_pHV7DbkUYhvtiWNhIX3fhEIIeUL0kIbRYesVtiyPGl0gh9_xrzUu7-KRCJXw/s640/Untitled.jpg" width="640" /></a></i></b></div>
</div>
<div style="text-align: center;">
<div style="text-align: center;">
<b><i>folder after extraction</i></b></div>
</div>
<div style="text-align: center;">
<div style="text-align: center;">
<b><i><br /></i></b></div>
</div>
<div style="text-align: center;">
<div style="text-align: justify;">
<b><i><br /></i></b></div>
</div>
<div style="text-align: justify;">
<b>3) <u>Download the NumPy library for Python:</u></b></div>
<br />
<div style="text-align: justify;">
<span style="font-weight: bold; text-decoration: underline;"><br /></span></div>
<div style="text-align: justify;">
Download NumPy, the matrix and array library for Python. Since you'll be using arrays extensively in image-processing applications, this library is a must; no OpenCV application or function will run without it.</div>
<br />
<div style="text-align: justify;">
Download the package from <a href="https://sourceforge.net/projects/numpy/files/NumPy/1.9.2/numpy-1.9.2-win32-superpack-python2.7.exe/download" target="_blank">here</a> and directly install it into the directory where python is installed.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Note: the installer defaults to the directory where your Python is installed, so there is no need to change it.</div>
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTAULUty_Zy2WemJ-uU_IVPuMWtqbPGqFSDnkfKJ6JzvVqUn-jaD4Y4FSKr9mtFw2JmSbzWJfkVoHMHZAQe4Pus2xbfLtVDVOAbp9jhVGx2eaN1UC6btHGU1_GSLWDLFczTKR0IhJYCKU0/s1600/Untitled.png" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTAULUty_Zy2WemJ-uU_IVPuMWtqbPGqFSDnkfKJ6JzvVqUn-jaD4Y4FSKr9mtFw2JmSbzWJfkVoHMHZAQe4Pus2xbfLtVDVOAbp9jhVGx2eaN1UC6btHGU1_GSLWDLFczTKR0IhJYCKU0/s640/Untitled.png" width="640" /></a></div>
<div style="text-align: justify;">
<b>4) <u>Download FFmpeg:</u></b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
This is probably the most important step in installing OpenCV on Windows. FFmpeg is the library OpenCV needs for running video applications in the Python setup.</div>
<div style="text-align: justify;">
See <a href="http://www.wikihow.com/Install-FFmpeg-on-Windows" target="_blank">here</a> for how to install FFmpeg on Windows. Please go through that tutorial, as I'm not covering it here. Remember, this library is very <b>important </b>for video processing applications.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b>5) <u>Setting up files for getting ready:</u></b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Now that you have installed all the required libraries and packages, it's time to move some files.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b>i) </b> <b>goto: Desktop\opencv\build\python\2.7\x86</b></div>
<div style="text-align: justify;">
and copy the pyd file,</div>
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2urhoVVC5hc2z9GsNZ612_NHxh67nXzNZEYDGC5Hozyv76jvWoisIIIJYdXxQ25Tx_AKUujRUechNDYu_ZfDotO0tN0nKjZZL8pwgLRCj0_YM44a2Jj_AKD7KX1SCABXILVS205aooi07/s1600/Untitled.png" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2urhoVVC5hc2z9GsNZ612_NHxh67nXzNZEYDGC5Hozyv76jvWoisIIIJYdXxQ25Tx_AKUujRUechNDYu_ZfDotO0tN0nKjZZL8pwgLRCj0_YM44a2Jj_AKD7KX1SCABXILVS205aooi07/s640/Untitled.png" width="640" /></a></div>
<br />
<div style="text-align: justify;">
<span style="font-weight: bold;"><br /></span></div>
<div style="text-align: justify;">
<b>ii) goto: C:\Python27\Lib\site-packages</b></div>
<br />
<div style="text-align: justify;">
and paste the previously copied pyd file.</div>
<br />
<div style="text-align: justify;">
<span style="font-weight: bold;"><br /></span></div>
<div style="text-align: justify;">
<b><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJNajVGPOlpKJgqtonLVlZvMYv1yt0r4iLTs8ExfShCfhdZdhlGrA0GYMZMNKk9rPx1WzIephoNkG5pcY4GIaYII7KdpOi6drLtseZLPlTQT9AUh1XUj5_sKAXNZgZk5EBO9W9jbBOwb4G/s1600/Untitled.png" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJNajVGPOlpKJgqtonLVlZvMYv1yt0r4iLTs8ExfShCfhdZdhlGrA0GYMZMNKk9rPx1WzIephoNkG5pcY4GIaYII7KdpOi6drLtseZLPlTQT9AUh1XUj5_sKAXNZgZk5EBO9W9jbBOwb4G/s640/Untitled.png" width="640" /></a> </b> </div>
<br />
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b>iii) goto: Desktop\opencv\build\bin and copy all the .dll files</b></div>
<div style="text-align: justify;">
<b>and paste them to each C:\python27 and the ffmpeg/bin folder you created in step 4).</b></div>
<div style="text-align: justify;">
<b><br /></b>
</div>
<div class="separator" style="clear: both; text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj4YoImEnes79dOF7xVewrz9R8_DnEXhOXDQkWI8bOWa5R5VtuSmGAohkaUp6oRCmPx8Md76CDQC5kGjLRLdd7J96oIsVwHURGb4N253q7nvX99Rcx7yemrWBc6ggbVcrgdp8w2PufBoqJX/s1600/Untitled.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj4YoImEnes79dOF7xVewrz9R8_DnEXhOXDQkWI8bOWa5R5VtuSmGAohkaUp6oRCmPx8Md76CDQC5kGjLRLdd7J96oIsVwHURGb4N253q7nvX99Rcx7yemrWBc6ggbVcrgdp8w2PufBoqJX/s640/Untitled.png" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div style="text-align: center;">
<div style="text-align: justify;">
<b><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj4YoImEnes79dOF7xVewrz9R8_DnEXhOXDQkWI8bOWa5R5VtuSmGAohkaUp6oRCmPx8Md76CDQC5kGjLRLdd7J96oIsVwHURGb4N253q7nvX99Rcx7yemrWBc6ggbVcrgdp8w2PufBoqJX/s1600/Untitled.png" imageanchor="1"></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4L1hiAz9cXL7t4Ke_QybcORgHhr7JZpUJeSuB4XFEK9ZGSXlfnXq2lLXThjPuzhOiGn6ulQ-hDP1hz_2CGl9FDsk7N8Ukq-3RwB5cbIJyYSEXxRHfa1IisURQ7A6dhztvWHZR1JKgb6vH/s1600/Untitled.png" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4L1hiAz9cXL7t4Ke_QybcORgHhr7JZpUJeSuB4XFEK9ZGSXlfnXq2lLXThjPuzhOiGn6ulQ-hDP1hz_2CGl9FDsk7N8Ukq-3RwB5cbIJyYSEXxRHfa1IisURQ7A6dhztvWHZR1JKgb6vH/s640/Untitled.png" width="640" /></a></b></div>
</div>
<div style="text-align: center;">
<div style="text-align: justify;">
<br /></div>
</div>
<div style="text-align: center;">
<div style="text-align: center;">
<b><i>copy the files to c:\ffmpeg\bin folder.</i></b></div>
</div>
<div>
<div style="text-align: justify;">
<b><i><br /></i></b></div>
</div>
<div>
<div style="text-align: justify;">
<b>6) <u>Run a Sample:</u></b></div>
</div>
<div>
<div style="text-align: justify;">
<b>goto: \Desktop\opencv\sources\samples\python2 and run any of the files. Here I'm showing you the optical flow example.</b></div>
</div>
<div>
<div style="text-align: justify;">
<br /></div>
</div>
<div>
<div style="text-align: justify;">
(Right-click the python file > Edit with IDLE > the script window will pop up > press F5 to run)</div>
</div>
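Before running the samples, it's worth verifying that Python can actually find the modules set up in the steps above. A minimal check using only the standard library (the module names `cv2` and `numpy` are the two installed above):

```python
import importlib.util

def check_modules(names):
    # Map each module name to True if Python can locate it on its path.
    return {name: importlib.util.find_spec(name) is not None for name in names}

# After a successful install, both of these should report OK.
for name, found in check_modules(["cv2", "numpy"]).items():
    print(name, "OK" if found else "MISSING -- recheck the steps above")
```

If either module shows MISSING, recheck the .pyd copy in step 5) i)-ii) before moving on to the samples.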
<div>
<div style="text-align: justify;">
<br /></div>
</div>
<div style="text-align: center;">
<div style="text-align: justify;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirWlrPN7ETgdzmJ_dQfUJc1xDO-NU-O2Ue0z-VBc8XiFaswAY6DPR0GVhCNpoOzpfZWyYrgTfA0L2vYmxbqzIrYl4eIGDyGNGqTOLI1Pby2CmuUW7waB1lkNUjDp3kd15J7RqsxRWqVpi2/s1600/Untitled.png" imageanchor="1"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirWlrPN7ETgdzmJ_dQfUJc1xDO-NU-O2Ue0z-VBc8XiFaswAY6DPR0GVhCNpoOzpfZWyYrgTfA0L2vYmxbqzIrYl4eIGDyGNGqTOLI1Pby2CmuUW7waB1lkNUjDp3kd15J7RqsxRWqVpi2/s640/Untitled.png" width="640" /></a></div>
</div>
<div style="text-align: justify;">
<b><br /></b>
</div>
<div style="text-align: center;">
<div style="text-align: center;">
<b><i>optical flow example</i></b></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
If you're still having problems, mention them in the comments and I'll try to solve them.</div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
<b>Thanks for reading,</b></div>
</div>
<div style="text-align: left;">
<div style="text-align: justify;">
<b>Cheers!</b></div>
</div>
</div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com4tag:blogger.com,1999:blog-3046429209072595972.post-69611779334407273552016-04-07T14:01:00.003-07:002016-04-08T06:13:27.295-07:00Traffic Count using OpenCV Python 'Moments' method | Code | Video<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
<span style="font-family: inherit;">I'm currently working on various simple image processing modules at my internship and got permission to showcase a simple traffic counter that uses a slightly different approach than conventional blob analysis or contour detection: the moments method. Image moments describe the geometry of an image, such as its centroid and area.</span></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
The following is the algorithm that I've applied:</div>
<div style="text-align: left;">
</div>
<ol style="text-align: left;">
<li style="text-align: justify;"><b>Get the Video frame by frame,</b> thereby applying processing techniques for every frame/image.</li>
<li style="text-align: justify;"><b>Apply background subtraction</b> to get a binary image of the moving vehicles. Since a static CCTV camera gives a static background, we simply subtract that background from the current frame, leaving only the moving vehicles.</li>
<li style="text-align: justify;"><b>Next, apply moments function to each frame</b>, therefore getting the centroid of the moving vehicles( binary image).</li>
<li style="text-align: justify;"><b>Finally, assign a range of pixel values in (x,y) </b>form on the frame so that when the centroid of the moment area crosses this range, the counter increments by one, which is reflected in the on-screen counter text. </li>
</ol>
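The centroid computation in step 3 and the line-crossing count in step 4 can be sketched without OpenCV. In the real code the moments come from OpenCV's `cv2.moments()` on the thresholded frame; here is a pure-Python sketch of the same computation on a small binary mask. The `count_line` value below is a made-up example, not a value from the original code:

```python
def centroid(mask):
    # Raw image moments of a binary mask: m00 is the area,
    # m10 and m01 are the sums of x and y over foreground pixels.
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                m00 += 1
                m10 += x
                m01 += y
    if m00 == 0:
        return None  # no foreground in this frame
    return (m10 / m00, m01 / m00)  # (cx, cy) = (m10/m00, m01/m00)

def count_crossings(centroids, count_line=2):
    # Increment the counter each time a centroid moves down past the line.
    count, prev_cy = 0, None
    for c in centroids:
        if c is None:
            continue
        cy = c[1]
        if prev_cy is not None and prev_cy < count_line <= cy:
            count += 1
        prev_cy = cy
    return count

# Centroid track of one vehicle drifting down the frame.
track = [(5.0, 0.0), (5.0, 1.5), (5.0, 2.5)]
print(count_crossings(track))  # one crossing of count_line=2
```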
<div style="text-align: justify;">
<b>CODE:</b></div>
<div style="text-align: justify;">
<b><a href="https://github.com/kartikmadhira1/Traffic-Counter_OpenCV" target="_blank">Click Here</a></b></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
<b>VIDEO:</b></div>
<div style="text-align: justify;">
<b><br></b></div>
<div style="text-align: justify;">
Here's a small video on how it works.</div>
<div style="text-align: justify;">
<br></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/tMmSMPxmLlo/0.jpg" src="https://www.youtube.com/embed/tMmSMPxmLlo?feature=player_embedded" frameborder="0" allowfullscreen=""></iframe></div>
<div class="separator" style="clear: both; text-align: center;">
<br></div>
<div class="separator" style="clear: both; text-align: center;">
<br></div>
<div class="separator" style="clear: both; text-align: left;">
<br></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
<br></div>
<div style="text-align: justify;">
<br></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com8tag:blogger.com,1999:blog-3046429209072595972.post-26647576889587683232016-04-07T13:29:00.000-07:002016-04-07T13:29:59.372-07:00Simple Ball Tracking Robot | Code | Part 3 <div dir="ltr" style="text-align: left;" trbidi="on">
Hello guys, this is the last and final part of the Simple Ball Tracking robot. This robot, though simple, took me around 3 months to build; exams, projects, deadlines and an internship made it longer. Finally, I'm uploading the code to GitHub.<br />
The GitHub repository contains two programs in the folder. The first is the OpenCV Python code for tracking, and the second is the serial receiving code for the Arduino Uno.<br />
I have also included a trackbar script so that you can set the HSV values according to the colour of the ball you want to track.<br />
In the future I'll make an extension to this project with the final, full-fledged robot included.<br />
<a href="https://github.com/kartikmadhira1/Ball-Tracker_OpenCV" target="_blank">Click here for the Code</a><br />
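As an aside, the HSV range test that the trackbar script helps you calibrate boils down to converting each pixel to HSV and checking it against lower and upper bounds per channel, which is what OpenCV's inRange call does. A minimal standard-library sketch; the bounds below are made-up example values, not the ones from my trackbar file:

```python
import colorsys

def in_hsv_range(rgb, lower, upper):
    # Convert an (r, g, b) pixel in 0-255 to HSV in 0-1 and test whether
    # it falls inside the [lower, upper] box, channel by channel.
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return all(lo <= x <= hi for x, lo, hi in zip((h, s, v), lower, upper))

# Hypothetical "red-ish ball" bounds: high hue, moderate saturation and value.
lower, upper = (0.90, 0.4, 0.3), (1.00, 1.0, 1.0)
print(in_hsv_range((230, 30, 40), lower, upper))
```

The trackbars simply let you adjust `lower` and `upper` live until only the ball survives the threshold.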
<br />
<br />
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com6tag:blogger.com,1999:blog-3046429209072595972.post-72936454754499131662016-04-03T12:54:00.002-07:002016-04-03T12:54:40.105-07:00GitHub Account for all Projects<div dir="ltr" style="text-align: left;" trbidi="on">
I'm sharing here the link to all the major repositories (projects) that I've made to date.<div>
<br /></div>
<div>
<a href="https://github.com/kartikmadhira1" target="_blank"><span style="font-size: x-large;">GitHub Link </span></a></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-23377808371696966272016-03-06T12:50:00.001-08:002016-03-06T12:50:39.651-08:00Simple Ball Tracking Robot | Video | Part 2<div dir="ltr" style="text-align: left;" trbidi="on">
It's been almost two months since I last posted an update, because of ENIGMA and a conference I had to attend. I have now got the robot tracking objects nicely; this project, although it seems simple, took me about three months to learn from scratch.<br />
<div>
Here's the final look of the Robot:</div>
<div>
<br /></div>
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivzV0j7nQJ94n1qbugtlL7NFsr0Woa-Ki2RG2lEJNSKH7rjQst7jls2qYRvd0QdThhYo3RUre_24V_57CGNljzklYphA5CVoUXMVxKDVZhWNFI9lvPsldHVH6kBIUPCuM2Ks0A5UNm0QdR/s1600/IMG_20160306_205125.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivzV0j7nQJ94n1qbugtlL7NFsr0Woa-Ki2RG2lEJNSKH7rjQst7jls2qYRvd0QdThhYo3RUre_24V_57CGNljzklYphA5CVoUXMVxKDVZhWNFI9lvPsldHVH6kBIUPCuM2Ks0A5UNm0QdR/s320/IMG_20160306_205125.jpg" width="236" /></a></div>
<div style="text-align: center;">
<i>Isn't she awesome?</i></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
After this tutorial I'll finally conclude this project, as only the motor testing remains. I'll also add some basic image processing techniques to make such a robot, and trust me, it's really easy. Don't be frightened by the Raspberry Pi.</div>
<div>
I'll add the code in the next and final part.</div>
<div>
<br /></div>
<div>
Here's the Video:</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/1b16ecLC3Mg/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/1b16ecLC3Mg?feature=player_embedded" width="320"></iframe></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Do Comment, rate and subscribe!</div>
<div>
<br /></div>
<div>
<br /></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com5tag:blogger.com,1999:blog-3046429209072595972.post-29807009244733871672016-02-26T09:51:00.002-08:002016-02-26T09:51:54.767-08:00PID Line Follower V2 | ENIGMA | Arduino Mega<div dir="ltr" style="text-align: left;" trbidi="on">
Hello folks! The previous version of the line follower robot was clumsy and PID tuning was tiresome, so my friend Ammar Gandhi and I decided to build an online tuning facility so that PID values can be entered directly on the bot.<br />
The basic algorithm remains the same, as does the code, with minor changes for the pushbuttons and the LCD.<br />
Here are some pictures from the process of building the bot.<br />
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHPud8fUTnav-yags6fbQQd_5S5sQLHgbmB-UZ4djwqI-ZQmB5TuSY5WLNmvp6_3Vivkt9i21WaIht1gkdZo3t5JYGQlKVIH0FxsoFgXD39ahsigcHyiGz0xbZ2WDIJ6sFttBsgABgULI7/s1600/IMG_20160215_153446_HDR.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHPud8fUTnav-yags6fbQQd_5S5sQLHgbmB-UZ4djwqI-ZQmB5TuSY5WLNmvp6_3Vivkt9i21WaIht1gkdZo3t5JYGQlKVIH0FxsoFgXD39ahsigcHyiGz0xbZ2WDIJ6sFttBsgABgULI7/s320/IMG_20160215_153446_HDR.jpg" width="236" /></a></div>
<div style="text-align: center;">
<i>Cutting Plywood for the frame.</i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<i><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGkPtAedv4uQnrDL5dEzFWzEutO2995lvFDISQufB_hS3elHiuuDlMXKnNXfGUg3dZQ-VuU4-wU1VsvTkymkNtsWBJfF5ur6DzqY0-tsunS0rC46VnyQJSYLqbeoVD21gwxhW-UlEzi6w3/s1600/IMG_20160217_101329_HDR.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGkPtAedv4uQnrDL5dEzFWzEutO2995lvFDISQufB_hS3elHiuuDlMXKnNXfGUg3dZQ-VuU4-wU1VsvTkymkNtsWBJfF5ur6DzqY0-tsunS0rC46VnyQJSYLqbeoVD21gwxhW-UlEzi6w3/s320/IMG_20160217_101329_HDR.jpg" width="236" /></a></i></div>
<div style="text-align: center;">
<i>Raw Frame</i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<i><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMz-2sd4580awAHdYOfk5JgzhMJeUOUXRhHGjo8NP0AdwX1aVrrTWAikWROOLxpsSGJB9gEJKhUoqH5hXEsKuiH0ls83aeTC8jNwlHtNMQCwMmCKQFtHFnzb3F7k52YK6SCAPdVt-3gNHL/s1600/IMG_20160218_011203_HDR.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMz-2sd4580awAHdYOfk5JgzhMJeUOUXRhHGjo8NP0AdwX1aVrrTWAikWROOLxpsSGJB9gEJKhUoqH5hXEsKuiH0ls83aeTC8jNwlHtNMQCwMmCKQFtHFnzb3F7k52YK6SCAPdVt-3gNHL/s320/IMG_20160218_011203_HDR.jpg" width="236" /></a></i></div>
<div style="text-align: center;">
<i>Electronics Installation</i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjm6MwOuqTUBq6pSHAYD1XxjIs2J9ic0o2cmnp8HahObPtp_cKSayCHBrmeXBZ_RAVpT1QDWcVkaoEH5Boipni8uPG6F3CZd1247hUwq9DImeMHfEuL00roRmCmbMh06OMZth5bbA5N_O_Q/s1600/IMG_20160226_220623.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjm6MwOuqTUBq6pSHAYD1XxjIs2J9ic0o2cmnp8HahObPtp_cKSayCHBrmeXBZ_RAVpT1QDWcVkaoEH5Boipni8uPG6F3CZd1247hUwq9DImeMHfEuL00roRmCmbMh06OMZth5bbA5N_O_Q/s320/IMG_20160226_220623.jpg" width="236" /></a></div>
<div style="text-align: center;">
<i>Final Robot '</i>ENIGMA'<br />
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Here's a short video on the working of the robot.</div>
</div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/yUhUHgo1_yo/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/yUhUHgo1_yo?feature=player_embedded" width="320"></iframe></div>
<br />
<div style="text-align: left;">
Here's another one.</div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/PqTxu9Z2MqQ/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/PqTxu9Z2MqQ?feature=player_embedded" width="320"></iframe></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Code for the Robot can be seen here,</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<a href="https://github.com/kartikmadhira1/Line-Follower" target="_blank">Click Here and open 'enigma' folder</a></div>
<div style="text-align: center;">
<br /></div>
</div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com3tag:blogger.com,1999:blog-3046429209072595972.post-3252921826426791032016-01-30T14:19:00.001-08:002016-01-30T14:22:29.686-08:00Line Follower | PID Algorithm | Arduino Code<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
I've lately been working on a line follower that can run on any track, be it a black line on a white background or vice versa, and recently won a line follower event at my college. Some people asked me for the code, hence this post. There are obviously minor adjustments and tweaks to be made, but the code runs perfectly, with the PID tuning done after long hours of trial and testing. Here is the list of hardware and software used.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
NOTE that the PID constants I tuned for my bot may or may not work for yours, so if you're going to use the code, make sure you tune the constants properly.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b><u>HARDWARE:-</u></b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
1. <u>Line Sensor Array</u></div>
<br />
<div style="text-align: justify;">
<span style="text-decoration: underline;"><br /></span></div>
<div style="text-align: justify;">
I have used a line sensor array consisting of 7 IR LEDs. The reason for using such an array is that sharper turns can be detected easily with 7 sensors, and overshooting of the bot off the path is reduced. One can also use just two LEDs on either side of the line, but that works only for acute turns, not obtuse or right-angled ones. The thing with a line follower is that even with perfectly tuned constants it will still wobble a bit while running, so the 7 IRs also help in reducing oscillations.</div>
<br />
<div style="text-align: center;">
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh5ytMjUk5a7v-OuTED5D_oRI3Vwq9kFdKxntdt4O4tElrXLAaVOhwfYDV3q-_mxYpfT7CUNqArA7OBW_NMygDAD-ppy-XttnFovvg_vqQRsZRAboOw_WBPfQ766jMBHEb1_UaQJIDF8fvs/s1600/IMG_20160130_145807_HDR.jpg" imageanchor="1"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh5ytMjUk5a7v-OuTED5D_oRI3Vwq9kFdKxntdt4O4tElrXLAaVOhwfYDV3q-_mxYpfT7CUNqArA7OBW_NMygDAD-ppy-XttnFovvg_vqQRsZRAboOw_WBPfQ766jMBHEb1_UaQJIDF8fvs/s200/IMG_20160130_145807_HDR.jpg" width="200" /></a></div>
</div>
<div style="text-align: justify;">
One important thing I learned: DO NOT place the motors and the sensor array close to each other. There should be enough distance between the motors and the sensors, so I placed the array at the front and the motors at the back. This gives the motors enough time to react as the sensors detect sharp turns.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
2. <u>L298 and 200 rpm motors</u></div>
<br />
<div style="text-align: justify;">
<span style="text-decoration: underline;"><br /></span></div>
<div style="text-align: justify;">
As I explained in the previous post, the L293D is an inefficient driver when it comes to handling large currents, and moreover there is a considerable voltage drop across the PNP and NPN transistors inside. Check it <a href="http://electronics.stackexchange.com/questions/107162/l293d-problem-cant-run-both-motors-together" target="_blank">here</a>.</div>
<br />
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
There's no specific reason why I chose 200 rpm motors; they were just lying around in my room. Note that increasing the rpm may make it harder to follow the line accurately.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
3. <u>Generic wheels and Chassis</u></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
4. <u>11.1 v Li-po battery</u><br />
<u><br /></u>
5. <u>Arduino Uno</u><br />
<u><br /></u></div>
<div style="text-align: center;">
<div style="text-align: center;">
<u><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjDEkm63QaBFji3-tQ5kEMtzgrOM4ASDXYgw50lXau9wCwa2C76apG-y9kmQu7KsF5UCN983pan4hbldS26Er0l-IEM8aGOWvF6m9VURi_Ik8fXTNeFP4NTQpaIswOrogH3rXQ080H3YgxS/s1600/IMG_20160130_145903_HDR.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjDEkm63QaBFji3-tQ5kEMtzgrOM4ASDXYgw50lXau9wCwa2C76apG-y9kmQu7KsF5UCN983pan4hbldS26Er0l-IEM8aGOWvF6m9VURi_Ik8fXTNeFP4NTQpaIswOrogH3rXQ080H3YgxS/s320/IMG_20160130_145903_HDR.jpg" width="236" /></a></u></div>
</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b><u>Basic sensing:-</u></b></div>
<br />
<div style="text-align: justify;">
<span style="font-weight: bold; text-decoration: underline;"><br /></span></div>
<div style="text-align: justify;">
There's no specific reason why I chose PID as the control strategy; I found from other similar bots that it is one of the best strategies for efficiently following the line. Before I get into PID, this is how I assigned values to the sensor readings.</div>
<br />
<div style="text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggAqD2GVSXSieD2odoxRSsso97GWMaWi9lFEl1RLCLgAs7iI_S8zpHX3zguui3tpLPs9GJ08exdVRhBl-Oew1ch1zmoBjvc_Iqt6tPBYGPnBaHOJjwRwEyKyGWiDS7VcnrMRD7J6YAEQFa/s1600/Untitled.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="172" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggAqD2GVSXSieD2odoxRSsso97GWMaWi9lFEl1RLCLgAs7iI_S8zpHX3zguui3tpLPs9GJ08exdVRhBl-Oew1ch1zmoBjvc_Iqt6tPBYGPnBaHOJjwRwEyKyGWiDS7VcnrMRD7J6YAEQFa/s320/Untitled.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
i) The centre sensor is assigned an integer value of error=20.</div>
<div class="separator" style="clear: both; text-align: justify;">
ii) In the same way, the other six sensors are assigned values from 35 (rightmost sensor) down to 5 (leftmost sensor).</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
The final error generated is,</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">final_error=error-20;</span></div>
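The weighting above can be written out explicitly. A minimal sketch, shown in Python for readability; it assumes each sensor reports 1 when over the line, and averages the weights when more than one sensor is active (that averaging is my own choice for illustration, the real code may combine readings differently):

```python
WEIGHTS = [35, 30, 25, 20, 15, 10, 5]  # rightmost ... leftmost sensor
CENTER = 20  # weight of the centre sensor

def final_error(readings):
    # readings: 7 values, 1 where a sensor sees the line, 0 otherwise.
    active = [w for w, r in zip(WEIGHTS, readings) if r]
    if not active:
        return 0  # line lost; real code would need a recovery strategy
    error = sum(active) / len(active)
    return error - CENTER  # signed: positive = line is to the right

print(final_error([0, 0, 0, 1, 0, 0, 0]))  # centred -> 0.0
print(final_error([1, 0, 0, 0, 0, 0, 0]))  # line far right -> 15.0
```

A signed error like this is what the PID terms below operate on: its sign picks which way to steer, and its magnitude how hard.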
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<b><u>PID Algorithm:-</u></b></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
1. <u>Proportional Term, Kp:-</u></div>
<div class="separator" style="clear: both; text-align: justify;">
The proportional term generates an output proportional to the final error computed above. The problem is that when the bot is centred over the middle sensor, the error, and hence the output, is zero.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">Output=Kp*(final_error);</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
2. <u>Derivative Term, Kd:-</u></div>
<div class="separator" style="clear: both; text-align: justify;">
The derivative term dampens the system, lessening the oscillations or wobbling that occur while following the path. Again, this term won't help when final_error is zero.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
</div>
<div style="text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">Output=Kd*(final_error-prev_error);</span></div>
<span style="font-family: "courier new" , "courier" , monospace;"></span><br />
<div style="text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">prev_error=final_error;</span></div>
<span style="font-family: "courier new" , "courier" , monospace;">
</span><br />
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;">3. <u>Integral Term,Ki:-</u></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;">The integral term does what the other two terms can't: it sums the error over time, so it can still contribute when the current error is zero (the previously accumulated values aren't zero).</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">integral+=final_error;</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">Output=Ki*(integral);</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;">Final output is ,</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">Output=</span><span style="font-family: "courier new" , "courier" , monospace;">Kp*(final_error)+</span><span style="font-family: "courier new" , "courier" , monospace;">Kd*(final_error-prev_error)+</span><span style="font-family: "courier new" , "courier" , monospace;">Ki*(integral);</span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;">This output is applied to the left or right motor through PWM, steering the bot back toward the line.</span></div>
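Putting the three terms together, here's a minimal runnable sketch of the same update in Python (the bot itself runs Arduino C on the uC; the gains below are made-up illustrative values, not tuned ones):

```python
# Minimal PID update, mirroring the Arduino snippets above.
# Kp, Kd, Ki here are illustrative values; tune them for your own bot.
def make_pid(Kp, Kd, Ki):
    state = {"prev_error": 0.0, "integral": 0.0}

    def update(error):
        state["integral"] += error                # running sum (I term)
        derivative = error - state["prev_error"]  # change since last step (D term)
        output = Kp * error + Kd * derivative + Ki * state["integral"]
        state["prev_error"] = error
        return output

    return update

pid = make_pid(Kp=2.0, Kd=1.0, Ki=0.01)
print(pid(3.0))  # first correction for an error of 3
```

On the Arduino, the same update runs once per sensor read, and the returned value is clipped to the PWM range before being written to the motors.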
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<b><u>PID Tuning:-</u></b></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
This is the most cumbersome and time-consuming part of the whole process of making a line follower. There's no strict method to tune it; however, here are a few steps I found online:</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
i) Make all the constants 0 and start by adjusting the Kp term alone, watching how the bot behaves. If the bot is sluggish and drifts off the path, increase Kp; if it oscillates too much, decrease Kp.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
ii) Next move on to Kd, and raise it until the wobbling dies down and the bot takes turns cleanly.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
iii) I have not been able to fully master the Ki term, but its job is to make the bot go fast on a straight track: there the error stays near zero, the integral holds the output steady, and the bot accelerates quickly.</div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div style="text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/bZeTKNKcap4/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/bZeTKNKcap4?feature=player_embedded" width="320"></iframe></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<b><u>Code:-</u></b></div>
<div class="separator" style="clear: both; text-align: justify;">
The bot_code.ino file is available here:</div>
<div class="separator" style="clear: both; text-align: justify;">
<a href="https://github.com/kartikmadhira1/Line-Follower.git" target="_blank">https://github.com/kartikmadhira1/Line-Follower.git</a></div>
<div class="separator" style="clear: both; text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: justify;">
<b>Please do subscribe to posts by mail.</b></div>
<div class="separator" style="clear: both; text-align: justify;">
<b><br /></b></div>
<div class="separator" style="clear: both; text-align: justify;">
<b>Cheers!</b></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: inherit;"><u><br /></u></span></div>
<div class="separator" style="clear: both; text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com3tag:blogger.com,1999:blog-3046429209072595972.post-34639763369466681542016-01-19T11:50:00.001-08:002016-01-19T11:50:14.116-08:00Arduino Cookbook By Jeremy Blum | Subscribe by Email to get the book!<div dir="ltr" style="text-align: left;" trbidi="on">
Here's an awesome Arduino cookbook by one of the best in the business, Jeremy Blum! The book contains lots of projects and covers basic interfacing like LCDs, RTCs, and much more. I have been using this book for a year and a half now, and it covers the basics as well as harder interfacing. If you're new to Arduino or want to learn about the Arduino's peripheral interfacing, then this is the book!<br />
<b><br /></b>
<b>To claim the book, subscribe to the posts via email. To subscribe, type your mail into the email box on the right of this page. After subscribing, I'll send you the Google Drive link.</b></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0tag:blogger.com,1999:blog-3046429209072595972.post-88210042755103186592016-01-15T03:25:00.001-08:002016-01-15T03:25:25.316-08:00OpenCV setup on Raspberry Pi 2<div dir="ltr" style="text-align: left;" trbidi="on">
Hello! In this post I'll explain how to set up the Python bindings for OpenCV. OpenCV is an open-source library for image processing, with interfaces for several languages including Python. I also want to mention that there's an excellent book on Raspi image processing by Ashwin Pajankar, and I highly recommend that you go through it. Here's the link to buy the book:<br />
<a href="http://www.amazon.in/Raspberry-Pi-Computer-Vision-Programming-ebook/dp/B00YHBVHO6" target="_blank">Book Link</a><br />
<br />
<div style="text-align: left;">
1. Before we start, let me first tell you that this installation can take a considerable amount of time, in some cases <u><b>4-5 hours</b></u> depending on the internet speed you have. First make sure that the Raspi is connected to the internet. If you're using a LAN cable, make sure that it's connected to the internet via the router or modem.<br />2. Run the following command in the terminal:<br /><span style="font-family: 'Courier New', Courier, monospace;"> </span></div>
<div style="text-align: left;">
<span style="font-family: 'Courier New', Courier, monospace;"> sudo service networking restart</span></div>
<span style="font-family: 'Courier New', Courier, monospace;"> </span><br />
<span style="font-family: Times, Times New Roman, serif;">3</span><span style="font-family: 'Courier New', Courier, monospace;">.</span><span style="font-family: Times, Times New Roman, serif;"> Run the following commands in sequence to update the Raspi firmware and the Raspbian packages.</span><span style="font-family: Courier New, Courier, monospace;"> apt </span><span style="font-family: Times, Times New Roman, serif;">stands for Advanced Package Tool, the package manager used to update packages on the Raspi.</span><br />
<span style="font-family: Times, Times New Roman, serif;"> </span><span style="font-family: Courier New, Courier, monospace;">sudo apt-get update</span><br />
<br />
<span style="font-family: Times, Times New Roman, serif;"> </span><span style="font-family: Courier New, Courier, monospace;">sudo apt-get upgrade</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<span style="font-family: Courier New, Courier, monospace;"> sudo rpi-update</span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;">4. Now install the required packages by typing </span><br />
<span style="font-family: Courier New, Courier, monospace;"> sudo apt-get install &lt;package&gt; </span><span style="font-family: Times, Times New Roman, serif;">where </span><span style="font-family: 'Courier New', Courier, monospace;">&lt;package&gt; </span><span style="font-family: Times, Times New Roman, serif;">is one of the following:</span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeOnSxfiGKm_O3hmbSLIwgYShcIdunYj2XOkyMYVSVAPzok-PSrKAuU3V28yi1qYi9RVvMMv_UQinZr83AtdzRiaF9YV5BsCPqlNDMT-KGZ86QfHKyfcqqT8HKLfUmQHBhdr8laMO7AYJB/s1600/Untitled.jpg" imageanchor="1"><img border="0" height="396" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeOnSxfiGKm_O3hmbSLIwgYShcIdunYj2XOkyMYVSVAPzok-PSrKAuU3V28yi1qYi9RVvMMv_UQinZr83AtdzRiaF9YV5BsCPqlNDMT-KGZ86QfHKyfcqqT8HKLfUmQHBhdr8laMO7AYJB/s400/Untitled.jpg" width="400" /></a></span><br />
<br />
<br />
<span style="font-family: Times, Times New Roman, serif;">Now that you've completed the setup, your Raspi is ready for image processing. Maybe next week I'll come up with the final ball tracker setup.</span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;">Cheers!</span><br />
<br /></div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com3tag:blogger.com,1999:blog-3046429209072595972.post-19853970845727567402016-01-15T01:32:00.001-08:002016-01-25T01:36:06.785-08:00Simple Ball Tracker Bot | Part 1|Hardware and Software requirements<div dir="ltr" style="text-align: left;" trbidi="on">
<div style="text-align: justify;">
Hello readers. This is going to be my first blog post in three years, and I've finally got enough material, from projects I've worked on and projects I'll be working on, to turn into blog posts. Hopefully I'll post updates as soon as possible.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Coming straight to the topic: the ball tracking bot is a simple image-processing-based bot that tracks a ball of a certain color (changeable by adjusting the HSV values) with a tilt camera driven by a servo, and follows the ball. It is going to be based on OpenCV. I chose this as my first topic because I wanted to start the blog with relatively easier topics than those that will soon follow. Image processing is a wide area; I've only just started learning it and only recently came up with the idea for such a bot (obviously inspired by similar bots on YouTube). So let's get straight into it and look at the hardware and software requirements:</div>
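On the bot this color test is done with OpenCV's inRange over a whole frame, but the per-pixel idea can be sketched with Python's standard colorsys module. The hue/saturation window below is a hypothetical one for an orange ball, not values measured for this project:

```python
import colorsys

# Hypothetical HSV window for an orange ball (colorsys uses H, S, V in [0, 1]).
H_MIN, H_MAX = 0.02, 0.12   # hue window
S_MIN, V_MIN = 0.4, 0.3     # reject washed-out or dark pixels

def is_ball_pixel(r, g, b):
    """Return True if an RGB pixel (0-255 channels) falls in the target HSV window."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return H_MIN <= h <= H_MAX and s >= S_MIN and v >= V_MIN

print(is_ball_pixel(255, 120, 0))  # saturated orange
print(is_ball_pixel(0, 0, 255))    # blue
```

HSV is preferred over raw RGB here because the hue of the ball stays roughly constant as lighting changes, while its RGB values shift a lot.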
<div style="text-align: justify;">
<br /></div>
<br />
I. <b><u>Hardware :</u></b><br />
<b><u><br /></u></b>
1) <b>Raspberry Pi 2 B+</b><br />
<br />
The reason for this being in the system is obvious: it'll be handling the image-processing part of the proposed system. I chose the Raspi over PC-based processing because it makes the bot mobile, with the capability of editing code live while running the bot. We're going to use the OpenCV library for Python, so we'll need Python 2.7 or a later version. I'm not well versed with Python; I'm still learning it. The OS I've installed is Raspbian.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjssktEaFCz2-kRw3BaDILeVN-p3Y805fsHSmmgv5NZ1bAMaBRRMRQ7XClXzukv-bl2v_nLWmtVvYYapiiLE0oRr2WxxdkHJKIS4aI-Fp7IuQGshfnvhUO239Vl1vxn_M0fD5CcSYVgCw3g/s1600/download.jpeg" imageanchor="1"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjssktEaFCz2-kRw3BaDILeVN-p3Y805fsHSmmgv5NZ1bAMaBRRMRQ7XClXzukv-bl2v_nLWmtVvYYapiiLE0oRr2WxxdkHJKIS4aI-Fp7IuQGshfnvhUO239Vl1vxn_M0fD5CcSYVgCw3g/s400/download.jpeg" /></a><br />
<br />
<div style="text-align: justify;">
2) <b>Slave Microcontroller</b> ( Arduino or any microcontroller that you are well versed with)</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
We'll make the ball-tracking decisions on the Raspi, but we'll send these decisions to a slave microcontroller that moves the servo and drives the motors for the chassis.</div>
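As a rough sketch of that split, the Raspi-side decision can be as simple as mapping the ball's x-position in the frame to a one-byte steering command for the slave uC. The frame width, dead band, and command bytes here are made up for illustration; in the real setup the bytes would go out over serial (e.g. via pyserial):

```python
FRAME_WIDTH = 320   # assumed capture width in pixels
DEAD_BAND = 40      # pixels around centre where we just go straight

def steer_command(ball_x):
    """Map the ball centroid's x-coordinate to a one-byte command for the slave uC."""
    centre = FRAME_WIDTH // 2
    if ball_x < centre - DEAD_BAND:
        return b'L'  # ball on the left -> turn left
    if ball_x > centre + DEAD_BAND:
        return b'R'  # ball on the right -> turn right
    return b'F'      # roughly centred -> go forward

print(steer_command(40), steer_command(160), steer_command(300))
```

The dead band keeps the bot from twitching left and right when the ball sits near the centre of the frame.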
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
3) <b>9 gm Servo</b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
I'm going to use a 9 gm servo, as I think it's going to be enough to move a small USB camera. One important thing to note is that servos draw a peak current, generally 200-300 mA for a 9 gm servo, so a compatible battery needs to be employed. Additionally, I have used a pan-tilt mechanism with the camera mounted on top and the servo on the side.</div>
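For reference, a hobby servo is positioned by a pulse repeated every 20 ms whose width sets the angle. Assuming the common 1000-2000 µs range (check your servo's datasheet, as ranges vary), the angle-to-pulse mapping the uC performs looks like this:

```python
PULSE_MIN_US = 1000.0  # pulse width at 0 degrees (assumed)
PULSE_MAX_US = 2000.0  # pulse width at 180 degrees (assumed)

def angle_to_pulse_us(angle):
    """Linearly map a servo angle in [0, 180] degrees to a pulse width in microseconds."""
    angle = max(0.0, min(180.0, angle))  # clamp out-of-range requests
    return PULSE_MIN_US + (PULSE_MAX_US - PULSE_MIN_US) * angle / 180.0

print(angle_to_pulse_us(90))  # centre position -> 1500.0
```

On an Arduino the Servo library does this for you (servo.write(angle)), but knowing the underlying pulse widths helps when debugging jitter or an off-centre camera.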
<div style="text-align: center;">
<div style="text-align: justify;">
</div>
</div>
<div style="text-align: justify;">
4)<b>USB Webcam/Camera.</b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
A USB webcam is, I guess, enough for the simple purpose of just tracking the ball, though if complex or higher-end image processing is required then you should use the Raspi Cam. The reason is that USB communication eats up a good share of the RAM/processing power, whereas the Raspi Cam is connected directly to the dedicated GPU, leaving the system free to compute other tasks. Have a look at this:</div>
<div style="text-align: justify;">
<a href="https://www.raspberrypi.org/forums/viewtopic.php?t=85899&p=607023" target="_blank"><b>Raspi Forum</b></a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
5) <b>DC Motors and L298D Driver</b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
I'm going to use normal 200 RPM DC motors, driven by an L298D.<br />
<br />
<b>Note:- an L293D will not be sufficient for this type of bot, as explained <a href="http://electronics.stackexchange.com/questions/107162/l293d-problem-cant-run-both-motors-together" target="_blank">here</a>.</b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
6)<b>Power supply </b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
For the Raspi I'll be using a separate 1000 mAh battery pack, and for the uC, servo, and motors I'll be using a 1000 mAh LiPo battery.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<b>IMPORTANT: </b>If you're going to use a uC other than an Arduino, it's better to use a USB-to-serial converter module because the Raspi and uC pins run at different logic levels. Like this one:</div>
<div style="text-align: justify;">
<br />
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYIQoePX5dxTaIPs-Rxp9Pn2T6n2PXTNAEGrTsDqN_s6j__1qRkwJTTEwSdWIed_ZIwR7eqb2fW9C_4pOf4bRfIhfet6YSnF2M8IhETz0OAZqEjMAHhyphenhyphen54bYhqmnXZaltupyHe_zx-wKhW/s1600/IMG_20160116_184339_HDR.jpg" imageanchor="1"><img border="0" height="233" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYIQoePX5dxTaIPs-Rxp9Pn2T6n2PXTNAEGrTsDqN_s6j__1qRkwJTTEwSdWIed_ZIwR7eqb2fW9C_4pOf4bRfIhfet6YSnF2M8IhETz0OAZqEjMAHhyphenhyphen54bYhqmnXZaltupyHe_zx-wKhW/s320/IMG_20160116_184339_HDR.jpg" width="320" /></a></div>
</div>
<div style="text-align: justify;">
<br />
II.<b><u>Software:</u></b></div>
<div style="text-align: justify;">
<b><u><br /></u></b>
</div>
<div style="text-align: justify;">
1) <b>Python 2.7 on Raspi </b></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
2) <b>OpenCV library for Python</b></div>
<br />
<div style="text-align: justify;">
<span style="font-weight: bold;"><br /></span></div>
<div style="text-align: justify;">
<b>Optional: Arduino IDE on Raspi</b><br />
<div style="text-align: center;">
<b><br /></b></div>
<div style="text-align: center;">
<b><br /></b></div>
</div>
<div style="text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhd5yb8B-4Go-dLM87Q1931uNq-Movqra4qnXZnTRmTtgdw1g8BeFxgG5CSJscQ8wIEmSWP7UT-43jgYL9KZaCzUfEYz0b1QyHpMNYG3w3D02zRzgiykylIWTvteVgIp-62poi9ygK8kuMH/s1600/IMG_20160116_211015_HDR+%25282%2529.jpg" imageanchor="1"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhd5yb8B-4Go-dLM87Q1931uNq-Movqra4qnXZnTRmTtgdw1g8BeFxgG5CSJscQ8wIEmSWP7UT-43jgYL9KZaCzUfEYz0b1QyHpMNYG3w3D02zRzgiykylIWTvteVgIp-62poi9ygK8kuMH/s320/IMG_20160116_211015_HDR+%25282%2529.jpg" width="318" /></a></div>
<div style="text-align: justify;">
<div style="text-align: center;">
</div>
<div style="text-align: center;">
<i>Bot Chassis (Haven't made any connections yet)</i></div>
<div style="text-align: center;">
<br /></div>
</div>
<div style="text-align: justify;">
Just get to the terminal window and type the following:</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<span style="font-family: "courier new" , "courier" , monospace;">sudo apt-get install arduino</span></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
I'll be coming up with another blog post on how to set up the Raspi for OpenCV.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Cheers!</div>
<div style="text-align: justify;">
<b><br /></b>
<br />
<b>Subscribe to my blog to keep track of future posts!</b></div>
</div>
Kartik Madhirahttp://www.blogger.com/profile/08788094853068714583noreply@blogger.com0