Sunday, March 5, 2017

ROS Autonomous Navigation Bot - [Clerkbot] - Initial Tests

It finally took me three months to fully bring up this robot, and just as a fun fact, a whole month of that went into tuning the ocean of parameters. That makes two months to build the robot itself, on top of a good three months spent 'learning' the ROS framework, which included Gazebo simulations of a UAV and a UGV. Now that it is over, we are all geared up for the challenge of autonomous UAV navigation.

This whole robot setup is part of a year-long research project on UAV and UGV platforms under Dr. D.K. Kothari, HOD, EC Department, Nirma University. The UGV setup was planned as a precursor to the UAV setup, as we wanted to get hands-on with the ROS framework. Getting our hands directly on a UAV can be a daunting task, and the UGV was commissioned to fulfil that exact need. The UGV is working perfectly. As for why exactly it is named Clerkbot, you'll need to wait.

This is going to be a series of tutorial-cum-documentation posts on the Clerkbot. In the coming videos and posts I will be dealing with all the details of the robot. I will also soon establish an open-source platform for the setup so that others can benefit from it.

Here's the initial tests video of the UGV:


Thursday, January 5, 2017

STM32 Cortex M3 Series - [STM32F103C8T6] - #2 - Encoder Interface

Before I begin, kindly keep my GitHub code as a reference throughout the tutorial. I'm currently working on an encoder interface to generate odometry data for my autonomous UGV and UAV project. This can be done in various ways with encoders of various types. The one I have is this 300RPM motor with quadrature encoder circuitry mounted on its shaft. Here's the motor I'm talking about, from Robokits:


Figure 1. The Quadrature Encoder Motor

The encoder circuitry takes +5V and GND as supply inputs and provides two channel outputs, A and B, which give square waves 90 degrees out of phase.
(NOTE: For lack of a datasheet on the website, I spent around two full days trying to figure out why the outputs weren't coming, only to realise later that the output pins need to be pulled high with a 10k resistor.)
What seems easy on paper is to simply take interrupts from both channels for higher resolution, put conditional checks in the interrupt routines to read the values, and send the values back to the main loop.
WRONG. Never do this. This is bad coding. The prominent reasons are:

  1. The encoder state changes very fast. There is a possibility that the routine skips some transitions or reads an erroneous state (channel_A_state = channel_B_state = 0 or 1), leaving the count wrong or illegal.
  2. Checking for a high or low value in an Interrupt Service Routine is a bad choice too. ISRs should be as small as possible: the core has been interrupted, and it needs to get back to its other tasks.

What I would like to point you towards is this StackExchange answer, which advocates using no decisions at all when reading the channel states in the interrupt:
"You should have ZERO ifs. No decisions. Store your AB state, i.e., 00 or 01, then append your next state, i.e. 0001 means AB went from 00 to 01, thus B changed from 0 to 1. Make this a +1. If starting from 00, and you change to 10, then call this a -1. Build a 16 element array of all possible transitions holding the number that needs to be added to your count if it occurs, noting that some are illegal and need to be handled."
The counting essentially works on the principle of Gray code. So we are going to build a Gray-code transition value and check it against an array of legal and illegal states. Here's the algorithm (a minimal code sketch follows the examples below):

  1. Create an array of the possible transitions in the Gray code, something like this:

    int8_t states[] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0};
  2. Also build the transition value (final_state = prev_AB_state appended with current_AB_state): left-shift the previous AB state by 2 and OR in the current AB state. (If this is confusing to you, see the full code on my git.)
  3. Here's how the 2-bit Gray code sequence looks, with only one bit changing between consecutive states: 00 → 01 → 11 → 10 → back to 00.
So, for example:
  • If final_state is 0001, prev_AB_state was 00 and it has changed to 01. This is given a legal value of -1, or anti-clockwise rotation.
  • If final_state is 0010, prev_AB_state was 00 and it has changed to 10. This is given a legal value of 1, or clockwise rotation.
  • If both channels A and B appear to change at once, we term it an illegal state, since both cannot legally change together; the table returns 0 for these.
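Putting it all together, here is a minimal ISR sketch of the lookup approach in C. The read_channel_A()/read_channel_B() helpers are hypothetical placeholders for your GPIO reads (for instance via GPIO_ReadInputDataBit() on the STM32), and clearing the EXTI pending flag is left to the surrounding handler; see the full code on my git for the real setup.

    #include <stdint.h>

    /* Transition table indexed by (prev_AB << 2) | current_AB.
       +1 = one step clockwise, -1 = one step anti-clockwise,
        0 = no change or an illegal double-transition. */
    static const int8_t states[16] = { 0, -1,  1,  0,
                                       1,  0,  0, -1,
                                      -1,  0,  0,  1,
                                       0,  1, -1,  0 };

    static volatile int32_t encoder_count = 0;
    static volatile uint8_t prev_ab = 0;

    /* Hypothetical helpers: return 0 or 1 for the current level
       on channels A and B. */
    extern uint8_t read_channel_A(void);
    extern uint8_t read_channel_B(void);

    /* Called from the EXTI handler of either channel: zero ifs,
       zero decisions, just one table lookup per edge. */
    void encoder_update(void)
    {
        uint8_t curr_ab = (uint8_t)((read_channel_A() << 1) | read_channel_B());
        encoder_count += states[(prev_ab << 2) | curr_ab];
        prev_ab = curr_ab;
    }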
I will soon post a video on the same, and my next post will be on setting up OpenOCD debugging and the Eclipse CDT setup on Linux.

Cheers!


Wednesday, December 14, 2016

The Low Cost Automatic 'Wall-Painter' Robot - A Sneak Peek!

It has been nearly 3 months of working on and designing the 'Wall-Painter' idea (I proposed it to the Idea Labs Committee, Nirma University about 3 months ago) along with team members Dishant Kavathia and Ammar Gandhi. The objective of the Wall-Painter is simple: a dedicated app will provide templates of paint designs as input, and the Wall-Painter will oblige and do the required. I have been doing projects for the last two years, but there was always a slight lack of professional 'furnishing' and 'designing.' Hence, the goal this time was to give ample time to designing and refining the project at hand.

The idea first required manifesting from my mind into a proper 3D model. Dishant Kavathia, a fellow teammate, has been super amazing at making the model come alive in SolidWorks and at rectifying the underlying mechanical difficulties. Ammar and I will be looking after the electronic installation and the algorithms to drive the tasks. Also, I can't indulge in any more details, as this is under development.

Here I am attaching a few snapshots of the proposed plan as designed by us (model made by Dishant Kavathia):


 Figure 1. The skeleton model


Figure 2. The basic X-Y frame

STM32 Cortex M3 Series - [STM32F103C8T6] - #1 - LED Blinking

So finally, here I am with one of the first posts in the ARM STM32 Cortex M3 series, after working with it for about 3-4 months now. The 8-bit controllers seem somewhat ugly in comparison to the powerful 32-bit ARM architecture, which boasts a whole lot of peripherals: DMAs, OTG, Ethernet and hundreds of GPIOs (obviously, I've still not touched most of them and am still learning). It does become a different ball game, though, when the application you need it for doesn't require this much computational prowess. Let's start with the very 'Hello World' of every hardware project, which is LED blinking.

The IDE I've been using is CooCox IDE, a very decent IDE that pairs with the GNU ARM toolchain and has a good amount of community sample code online. There are two ways we can program using this IDE:
  1. Using the existing, well-documented standard libraries for the STM32F103xx series, which is what I've used here; this blog has been helpful to me for programming the series.
  2. Using low-level, register-level programming with just the basic ST headers, which is well documented in the YouTube tutorials by Patrick Hood-Daniel.
The final goal is to light up an LED, so we'll need to set up the GPIOs, or the port, for the LED blinking. Before we begin, it will be easier if you keep the reference manual (and not the datasheet) open in the background.

  1. The Cortex M3 core is connected to its peripherals via different buses. These buses are prominent parts of the ARM architecture: the AHB, or Advanced High-performance Bus, which in turn connects to the APBs, the Advanced Peripheral Buses. From the datasheet information given below:

    Figure 1. Architecture diagram, AHB and APB2 buses

    We can see that the GPIOs are clocked from the APB2 bus, which runs at a maximum of 72MHz. So our first task will be to enable the clock source for these GPIOs.
  2. Next, since we are going to use the GPIOs, we need to specify the parameters for the port or pin to work: input or output, the type of input or output, etc., as is the case with every microcontroller.
    The following are the input and output types available on the GPIOs, as given in the reference manual (Pg. 158, section 9.1):
Figure 2. Input and output pin configurations

Figure 3. Basic input and output block Structure of the GPIO's 

Here's a brief on the types of these pins:
  • Input floating: The pin is left in a 'floating' or tri-state condition, so its reading is meaningless unless an external source pulls it high or low or drives it to a definite state.
  • Input pull-up: The pin reads high when nothing is connected to it, since a resistor internally pulls it up.
  • Input pull-down: Same as above, but the pin is internally pulled down with a high-value resistor when nothing is connected.
  • Output open-drain: The output transistor's drain is left open, so a resistor tied to a voltage rail needs to be connected externally for the pin to go high.
  • Output push-pull: An open-drain output drives in only one direction; when the pin has to go high it has to rely on the resistor and cannot source current by itself. A push-pull output therefore has two transistors, one to sink and one to source the current. Here is a good excerpt from this site:
    "The push-pull output actually uses two transistors. Each will be on to drive the output to the appropriate level: the top transistor will be on when the output has to be driven high and the bottom transistor will turn on when the output has to go low.
    The advantage of the push-pull output is the higher speed, because the line is driven both ways. With the pullup the line can only rise as fast as the RC time constant allows. The R is the pullup, the C is the parasitic capacitance, including the pin capacitance and the board capacitance.
    The push-pull can typically source more current. With the open-drain the current is limited by the R and R cannot be made very small, because the lower transistor has to sink that current when the output is low; that means higher power consumption. "
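As a small preview of the coding (which I'll cover properly in the next part), here is a minimal sketch using the standard peripheral library. It assumes the on-board LED sits on PC13, as on common STM32F103C8T6 boards; adjust the port and pin to your wiring.

    #include "stm32f10x.h"

    /* Crude, uncalibrated busy-wait delay. */
    static void delay(volatile uint32_t count)
    {
        while (count--);
    }

    int main(void)
    {
        GPIO_InitTypeDef gpio;

        /* Step 1: enable the APB2 clock for port C. */
        RCC_APB2PeriphClockCmd(RCC_APB2Periph_GPIOC, ENABLE);

        /* Step 2: configure PC13 as a push-pull output. */
        gpio.GPIO_Pin   = GPIO_Pin_13;
        gpio.GPIO_Mode  = GPIO_Mode_Out_PP;
        gpio.GPIO_Speed = GPIO_Speed_2MHz;
        GPIO_Init(GPIOC, &gpio);

        while (1)
        {
            GPIO_ResetBits(GPIOC, GPIO_Pin_13);  /* LED on (PC13 is active-low) */
            delay(500000);
            GPIO_SetBits(GPIOC, GPIO_Pin_13);    /* LED off */
            delay(500000);
        }
    }

Note how the two numbered steps above map directly onto the clock-enable call and the GPIO configuration.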
I will go through the coding in detail in the next part, as this would otherwise become a very long post.
Thanks for reading, and cheers!



Thursday, August 25, 2016

RPLiDAR + ROS + Hectorslam Setup

So, I have lately been working on a very interesting but difficult project with team UAV Nirma: 'UAV Navigation Indoors', with a timeline running till August 2017.

The first thing you might be asking is: why such a costly and complicated setup?
The thing is, it is not as complicated as you think. The specific algorithm we want to build around is SLAM (Simultaneous Localisation and Mapping).

What SLAM does is map the environment, extract landmarks, and work out where it is using those landmarks. There is very good documentation available here.

Since the quad has to know where it is with respect to the landmarks, we have to use a lidar, and to use the lidar, ROS.

The lidar we're using is the RPLIDAR from RoboPeak. 


For those looking into how to interface the RPLIDAR with ROS, here's a very good tutorial.

Here's a video of the hectorslamming:


Monday, August 1, 2016

People Counter using OpenCV and Python

A people counter is a counting system based on image processing techniques that counts human traffic at places like retail shops, malls, and other public spaces. The module was built on an RTSP stream from an overhead CCTV camera, processed through the OpenCV library for Python, and was made during my internship.


 



Figure 1. Snap of the people counter GUI



The live video feed is processed frame by frame: morphological operations are applied to each frame to get rid of noise and obtain proper blobs. Next, using the contours method, we extract the coordinates of each blob and track them so that a blob can be counted as it passes through a region of the frame. The accuracy of the counting depends on a number of factors such as camera height, field of view, camera angle, lighting, etc.
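For a flavour of the counting step only (the OpenCV pipeline in my module is Python and not shown here), here is a toy C sketch of the region-crossing logic on hypothetical blob centroid data; names like Blob and COUNT_LINE_Y are my own illustrations, not from the module.

    #include <stdio.h>

    /* Hypothetical tracked blob: y-coordinate of its centroid in the
       previous and current frames. */
    typedef struct {
        int prev_y;
        int curr_y;
    } Blob;

    #define COUNT_LINE_Y 240   /* virtual counting line across the frame */

    /* A blob is counted once, the moment its centroid crosses the line:
       downward crossings count as entries, upward ones as exits. */
    static void count_blob(const Blob *b, int *entries, int *exits)
    {
        if (b->prev_y < COUNT_LINE_Y && b->curr_y >= COUNT_LINE_Y)
            (*entries)++;
        else if (b->prev_y >= COUNT_LINE_Y && b->curr_y < COUNT_LINE_Y)
            (*exits)++;
    }

    int main(void)
    {
        /* Two frames' worth of made-up centroids for three tracked blobs. */
        Blob blobs[] = { {200, 250}, {260, 230}, {100, 120} };
        int entries = 0, exits = 0;

        for (unsigned i = 0; i < sizeof blobs / sizeof blobs[0]; i++)
            count_blob(&blobs[i], &entries, &exits);

        printf("entries: %d, exits: %d\n", entries, exits);  /* entries: 1, exits: 1 */
        return 0;
    }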

Take a look at the video:





Sunday, June 26, 2016

Glance time/Attention time span using OpenCV and Python

This is the first among the many modules which I have made with my team at TetherBox Technologies, an awesome IoT startup in the heart of Bangalore!
My colleague and I have been building the people counter system since April 2016, and before I go into that later next month, I thought of posting about the glance time, or face attention span, module I made here. It is to be deployed at major companies that want insights into customers looking at their products, specifically their attention span on the most important products on display.

I have just used the Haar cascade frontal face classifier and the Python time module.

Here's the video: