Install GCC with C++14 support on Ubuntu/Mint

The current GCC in the Ubuntu repositories doesn’t support the C++14 standard. To use C++14, GCC has to be updated manually. A newer version can be found in the Ubuntu Toolchain PPA. After adding it, the C++ compiler can be updated. The following commands show how to add the repository and install the compiler:
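A possible sequence (assuming the PPA provides g++-4.9 for your release) is:

```shell
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install g++-4.9
```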

After the installation, the compiler can be used by this command:
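The versioned binary is invoked directly, e.g. (`main.cpp` is a placeholder file name):

```shell
g++-4.9 -std=c++14 -o main main.cpp
```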

The default compiler will not be updated by this. You can see this because the g++ symlink still points to the old g++ version:
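```shell
ls -l /usr/bin/g++
```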

To set g++-4.9 as the default compiler, the g++ symlink has to be updated:
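One way to do this (overwriting the existing symlink):

```shell
sudo ln -sf /usr/bin/g++-4.9 /usr/bin/g++
```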

To check that you are now using the right compiler, you can print the gcc version. It should be something like 4.9.X.
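```shell
g++ --version
```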

That’s all. Source code can now be compiled like this:
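```shell
g++ -std=c++14 -o main main.cpp
```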

Posted in linux | 4 Comments

TSL2561 light sensor with STM32F4

The TSL2561 is a sensor that measures the light intensity and transforms it into a digital value with 16-bit resolution. There are two photodiodes on the sensor: one senses visible plus infrared light, and a second one measures only infrared light. These two photodiodes convert the measured light intensity simultaneously. There are two channels available on the sensor that transfer the values into the data registers (channel 0 and channel 1). The idea is to subtract the infrared part of the light, so that the resulting light spectrum is the part visible to humans.

The measured data can be received via I2C or SMBus. In this article, the I2C bus will be used for the communication between the STM32F4 and the TSL2561.

Connect the device

The TSL2561 works with 3.3V; don’t use the 5V pin, as you may destroy the sensor. Further, there are two pins (SDA and SCL) for the I2C bus communication. SDA is the data line, while SCL is the clock. The address pin is used to set a unique I2C bus address. The default address can be used, unless there is another device with the same address on the bus. The default address is 0x39, but it can also be set to 0x29 or 0x49. The sensor can generate an interrupt, so that it is not necessary to poll the device. In this example, the interrupt signal is not used.

TSL2561 light sensor

The software part

The STM32F4 controller has several I2C interfaces. The example implementation uses I2C1, which is connected to GPIO6 (SCL) and GPIO7 (SDA). The configuration is encapsulated in a configuration header, so it is easy to change the bus. The software can be downloaded from the repository (the project is called TSL2561_example). The maximum clock frequency for the device is 400kHz, so this frequency is set in the configuration file. The init_lightsensor(void) method configures the necessary IOs and the I2C communication via initLightSensorI2C(void).

To read the measured values, there are four registers (two for every channel) that can be read. For each channel there is an 8-bit register that contains the value’s low byte, and a second register for the high byte.

address description
0x00 control register
0x8C low byte (channel0)
0x8D high byte (channel0)
0x8E low byte (channel1)
0x8F high byte (channel1)

To read bytes from the device registers, the requested register addresses should be sent bytewise to the control register (0x00). There is a command bit that must be set so that the device reacts to the request.

The received data can be converted to the SI unit lux. This calculation is already implemented in the example code. The read_lightness_value() method receives the data bytes via I2C and performs the conversion to lux.

Appendix

Posted in ARM, embedded systems, English, Sensors, STM32F4 | Tagged , , , , , | 1 Comment

STM32F4 with ERIKA Enterprise RTOS OSEK

Erika Enterprise is an open source OSEK compliant real time operating system (RTOS) that supports the STM32F4-Discovery board. There is an eclipse tool chain integration available, so it is possible to develop software directly in this IDE. Combined with the GNU ARM Eclipse Plugin, the stlink debugger and programmer, and the GNU Tools for ARM Embedded Processors, eclipse is a great tool for developing, flashing and in-circuit-debugging applications based on the ERIKA real time kernel for STM32F4 devices. The OS can be downloaded here; if there are any questions about the installation, please have a look at the wiki or the forum on the ERIKA website.
The OS comes with an oil-file, which is a configuration file for the kernel. In this file, there are several options to configure the kernel and to define dependencies on used libraries, e.g. the Cortex Microcontroller Software Interface Standard (CMSIS).
The ERIKA kernel supports several classes of tasks (basic and extended tasks). Compared to basic tasks, extended tasks support synchronization via system events. This nice feature allows a task to be activated by sending an event within the software. To learn more about the operating system, please have a look at the official documentation.
The following section describes the usage of ERIKA Enterprise with extended tasks and event based scheduling on the STM32F4 controller. To use this kernel configuration, there are some necessary options that can be set in the oil-file. A configuration for the controller looks like this:
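A minimal oil-file sketch for this setup might look like this (the field names follow the ERIKA examples, but their exact placement can differ between kernel versions):

```
CPU mySystem {
    OS EE {
        CPU_DATA = CORTEX_MX {
            MODEL = M4;
            APP_SRC = "main.c";
            APP_SRC = "pwm.c";       /* repeat APP_SRC for every source file */
            COMPILER_TYPE = GNU;
            MULTI_STACK = TRUE;
        };

        MCU_DATA = STM32 {
            MODEL = STM32F4xx;       /* the CPU model / processor family */
        };

        CPU_CLOCK = 168.0;           /* MHz, STM32F407VG */
    };
};
```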

The STM32F4-Discovery has an ARM Cortex-M4 CPU which works with a frequency of 168 MHz, so the CPU_CLOCK parameter is set to an appropriate value. The CPU clock depends on the exact model; in this example I used the STM32F407VG. The APP_SRC parameter is used to list every *.c file that should be compiled with the kernel. To use multiple files, the parameter can be repeated. The development is based on the GNU ARM compiler and debugger, so the COMPILER_TYPE is used to declare the GNU toolchain. To use extended tasks, it is necessary to enable MULTI_STACK with a static stack size. The last block defines the CPU model. It is set to the STM32F4 processor family.

To integrate external libraries, like the ARM specific CMSIS, there is a possibility to add some libraries to the kernel. The following configuration shows how to define the usage of the additional libraries:
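The libraries are enabled via EE_OPT defines in the oil-file. The flag names below are assumptions; check the ERIKA wiki for the flags matching your kernel version:

```
EE_OPT = "__USE_CMSIS_ALL__";   /* flag names are assumptions; consult the  */
EE_OPT = "__USE_SPD_ALL__";     /* ERIKA wiki for your kernel version       */
```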

These parameters are processor specific for the STM32F4 controller. The next configuration part describes the options to configure the ERIKA kernel scheduling. The kernel supports four conformance classes that can be set (BCC1, BCC2, ECC1, ECC2). The BCC classes support basic tasks, while the ECC classes support extended tasks. Extended tasks are necessary for event driven scheduling. To read more about the conformance classes, please have a look at the ERIKA reference manual. The conformance class can be set with the following parameter:
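```
KERNEL_TYPE = ECC1;    /* extended conformance class, enables extended tasks */
```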

The next step is to configure events and tasks. In this example there will be one task that is triggered by one event. This task will be called setPwmTask and it reacts to a pwmEvent. Tasks can react to a set of events; these have to be assigned in the configuration file. Accordingly, a configuration for one task that reacts to one event looks like this:
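A sketch following the OSEK OIL conventions (the stack size of 512 bytes is an arbitrary example value):

```
TASK setPwmTask {
    PRIORITY = 1;
    ACTIVATION = 1;
    SCHEDULE = FULL;
    AUTOSTART = TRUE;
    STACK = PRIVATE {
        SYS_SIZE = 512;      /* static stack size for the extended task */
    };
    EVENT = pwmEvent;        /* events the task reacts to */
};

EVENT pwmEvent {
    MASK = AUTO;
};
```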

The next step is to declare them in the implementation, so that the task can be activated from the software:
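Using the standard OSEK task and event API, this could look like the following sketch (requires the ERIKA header ee.h, so it only compiles inside an ERIKA project):

```c
#include "ee.h"

DeclareTask(setPwmTask);
DeclareEvent(pwmEvent);

TASK(setPwmTask)
{
    while (1) {
        WaitEvent(pwmEvent);    /* block until the event is set */
        ClearEvent(pwmEvent);
        /* update the PWM output here */
    }
}
```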

The last step is the initialization of the kernel. After this, the task can be activated by setting the pwmEvent event. The following snippet shows how to initialize the kernel and then activate the setPwmTask periodically:
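A sketch of the main function (the initialization call name is an assumption and may differ per ERIKA version; the delay is left abstract):

```c
#include "ee.h"

int main(void)
{
    EE_system_init();   /* kernel/board initialization; name is an assumption */

    for (;;) {
        SetEvent(setPwmTask, pwmEvent);   /* wakes the extended task */
        /* wait one period here, e.g. via an alarm or a delay loop */
    }
    return 0;
}
```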

ERIKA Enterprise is a very nice open source real-time OS that can easily be integrated into the eclipse IDE. It runs on the STM32F4-Discovery board, so all the nice features of an RTOS are available.

Posted in embedded systems, microcontroller, STM32F4 | Tagged , , , , , , , | 2 Comments

SRF02 ultrasonic sensor with STM32F4-Discovery

The SRF02 ultrasonic range finder is an ultrasonic distance sensor, a transceiver with only one membrane. That’s practical, because the sensor is very small and so it is perfect for small sized hardware applications. It detects distances from 16 cm to 6 m. The sensor needs a 5 V supply, so it can be directly connected to many microcontrollers. It communicates through a serial interface or through the I²C bus (also called I2C bus). The following article describes how to connect and use the sensor with an STM32F4 through I2C. The bus is very handy because it needs only two wires (SDA and SCL) for communication. SDA is the data wire and SCL is the clock. Components can be distinguished by a unique I2C address. In this example there are four SRF02 sensors on one bus. The configuration of the address will be described too. The I2C master is an ARM based STM32F4-Discovery (STM32F407VG) controller.

SRF02 front view

SRF02 back view

SRF02 side view

SRF02 Basics

The SRF02 measures distances with ultrasonic signals. For this, the sensor sends a short ultrasonic pulse through the membrane and afterwards detects the reflection of the signal. The sensor also measures the elapsed time. The speed of sound is constant, so the distance can be calculated from the time between sending the signal and detecting the reflection. This procedure is called time-of-flight.

The SRF02 does not have to be calibrated; this was already done by the manufacturer. The sensor provides three different units, which can be configured. The following table shows the command codes for the different modes:


command function
0x50 measured data in inch
0x51 measured data in cm
0x52 measured data in microseconds
0x56 fake range mode (inch)
0x57 fake range mode (cm)
0x58 fake range mode (microseconds)
0xA0 1. sequence to set the I2C address
0xA5 2. sequence to set the I2C address
0xAA 3. sequence to set the I2C address

Setting an unique I2C address

The addresses of the sensors are transferred as hexadecimal bytes. Initially, every sensor has the same address, 0xE0. If there is only one sensor on the bus, there is no need to change the address, but if there are several of these sensors, each address has to be changed to a unique one. The new address has to be transferred to the command register of the sensor (0x00). To change the address, the sensor has to be put into a mode where it receives and saves the new address. For this, there are three control bytes (1st to 3rd sequence) which have to be transmitted. The new address stays saved on the sensor, even if it’s switched off.

The sensor shows its address while starting. There is a LED which flashes the saved address with short impulses. When the sensor is switched on, it flashes the LED once (long impulse). After this, the sensor flashes the address as short impulses. For example, with the address 0xE0 the sensor will not give short impulses, but with the address 0xF0 it will flash eight times. The mapping of the addresses and the signals can be found in the following table:

address flashes
0xE0 0
0xE2 1
0xE4 2
0xE6 3
0xE8 4
0xEA 5
0xEC 6
0xEE 7
0xF0 8
0xF2 9
0xF4 10
0xF6 11
0xF8 12
0xFA 13
0xFC 14
0xFE 15

In the repository (project: SRF02_example) there is an implementation to change the sensor address. Because all SRF02 sensors have the same initial address, it is necessary to change the addresses one after the other. The address change can be done with the function setSensorI2CAddress(). The function requires only the current and the new I2C address. An example implementation looks like this:
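A minimal sketch (connect only one sensor at a time; the init call is taken from the example project):

```c
/* addresses are changed successively, one sensor on the bus at a time */
initSRF02();                       /* init the I2C bus first */
setSensorI2CAddress(0xE0, 0xE2);   /* current address 0xE0, new address 0xE2 */
```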

Connect the sensor with STM32F4

The example code in the repository shows how to configure and connect the sensor with the STM32F4 controller. It can be found in the directory SRF02_example. The project uses four sensors in total. First the I2C bus is initialized with the function initSRF02(). After this, every sensor gets initialized by a call of initUltrasonicSensorI2C(), which expects the sensor’s I2C address. This function additionally sets the measurement unit. Then the distance can be measured with the readDistance() function. The following code snippet is an example implementation to read the distance from one sensor:
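A sketch for one sensor (the readDistance() parameter list is an assumption based on the description above):

```c
initSRF02();                     /* init the I2C bus */
initUltrasonicSensorI2C(0xE2);   /* init one sensor; also sets the unit */

/* parameter list is an assumption; see the repository for the real API */
uint16_t distance = readDistance(0xE2);   /* distance in the configured unit */
```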

Appendix

supplier: exp-tech

Posted in ARM, Eingebettete Systeme, embedded systems, English, microcontroller, Sensors, STM32F4 | 2 Comments

Xbox360 controller C integration

Today I have written some simple example code to integrate a Xbox360 controller into a C-based program. It reads the values of the controller axis and the buttons and displays them on the screen. The following picture shows the output of the code:

XBox360 controller signals

In this case the buttons “A” and “TL”, which is the top left button, were pressed. The project is called xboxControllerClient and can be downloaded from my repository at https://code.google.com/p/scholtyssek-blogspot/.

The whole configuration is in the header file xboxController.h. There the mapping of the buttons and the axes is implemented. The data will be stored in a struct called xboxCtrl, which is also defined in the header file. The following example shows how to use the code:
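A minimal usage sketch, assuming the signatures suggested by the method names (error handling omitted):

```c
#include "xboxController.h"   /* configuration and the xboxCtrl struct */

int main(void)
{
    /* opens the default device /dev/input/js0 (XBOX_DEVICE) */
    initXboxContoller(XBOX_DEVICE);

    xboxCtrl *xbox = getXboxDataStruct();
    while (1) {
        readXboxData(xbox);          /* update axis and button states */
        printXboxCtrlValues(xbox);   /* print them to the screen */
    }
    return 0;
}
```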

The method initXboxContoller(XBOX_DEVICE) opens a connection to the device /dev/input/js0. This device is set as the default device. To change this, change the value of XBOX_DEVICE. Next, the method getXboxDataStruct() returns a pointer to the data struct xboxCtrl. In this struct, all information is stored. You can update the information with a periodic call of readXboxData(xbox). To print the state of the XBox controller, you should call printXboxCtrlValues(xbox). That’s it, happy coding 🙂

Posted in Eingebettete Systeme, embedded systems, English, linux, xbox360 | Tagged , , , , , | Leave a comment

Rigol DS1052D and Open Logic Sniffer

Today I tested the logic analyzer of the Rigol DS1052D oscilloscope and noticed that the LA does not interpret the measured signals. The Rigol DS1052D is a lower priced model, so it is ok that the data is not interpreted by the oscilloscope. But it is possible to export the signals and process them on a PC with the open source tool Open Logic Sniffer. To do so, you can export them as a CSV data file. After a simple conversion, the data can be read by OLS. The following steps describe what you have to do.

Rigol DS1052D oscilloscope

1. Save and export the samples

With the “RUN/STOP” button it is possible to freeze the current measurement. Then the samples can be saved on a USB device. For this, you can push the “Storage” button and select the menu entry “External”. This entry will only be enabled if a USB device is connected. Now you should select the CSV format and enter a filename (e.g. “samples.csv”). After that you can save the file.

2. Convert the data

The CSV file can be converted with a simple command:
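A possible conversion, assuming the Rigol CSV has two header lines and the digital sample value in the second column (adjust the column and header count to your export format):

```shell
tail -n +3 samples.csv | awk -F',' '{ printf "%08x@%d\n", $2, NR-1 }' > samples.ols
```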

This command creates a new file “samples.ols”, which then has to be edited manually. For this, you should open the file and remove the first two lines. Afterwards, you should add the parameters Rate and Channels as the first two lines. Rate is the sample rate used for the capture (here 50MHz) and Channels defines the count of used channels (1-16). The file should now look like this:
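With the two header lines added, the beginning of samples.ols should look roughly like this (the sample values below are placeholders):

```
;Rate: 50000000
;Channels: 16
00000000@0
00000001@1
```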

3. Import data in OLS

Now the file “samples.ols” can be imported into Open Logic Sniffer. Open the file with the menu entry “File -> Open”. If everything was correct, you should see your data:

Imported scope data

Posted in linux | Tagged , , , | Leave a comment

STM32F4 controlled omnidirectional mecanum robot with odometry

In the last months I worked on a new project based on an ARM STM32F4 controller. The goal was to implement software to control a robot with mecanum wheels (also called swedish wheels). These wheels are very special, because there are rubber rollers arranged at 45 degrees on the outer rim that roll passively on the ground. Thus the robot has a further degree of freedom. This means all directions (X, Y, Θ) can be reached by the robot on a plane. Despite the lack of a steering axle it has the maximum freedom of movement. X and Y correspond to the movement in the respective directions, and Θ represents the rotational movement of the robot.

the robot

The robot is a Nexus 4WD. It has four Faulhaber motors which are directly equipped with incremental encoders; furthermore it also has (ultrasonic) distance sensors and an Arduino controller with suitable motor drivers. The 12 V DC motors are powered by a battery that is connected with a standard Tamiya plug.
Since I decided to use the STM32F4, I performed some modifications on the robot. In particular, the Arduino controller was replaced by the STM32F4, but this had the consequence that the motor drivers, which were permanently installed on the Arduino board, were also removed from the robot. Therefore, a new motor controller with appropriate drivers (h-bridge) had to be developed so that the motors can be controlled properly. In addition to the existing sensors and actuators, the robot was equipped with a gyroscope so that the orientation, that is, the rotational movement of the robot, can be measured in the context of odometry.

Figure1 and figure2 show the robot with an open chassis. They also show the four mecanum wheels which are fixed to the shafts of the motors. In addition, you can see – despite the bunch of wires – the hardware components and the battery. Figure2 shows the robot in a top view and figure3 shows the individual hardware components highlighted.


Figure1: Nexus 4WD (back view)

Figure2: Nexus 4WD (top view)

Figure3: Nexus 4WD with highlighted hardware


The movement of the robot is omnidirectional and depends on how the wheels are arranged and how they rotate. The mecanum wheels are mounted so that the rollers point towards the center of the vehicle (seen from above). Thus, the movement direction is defined by the wheel movement. Figure4 shows three scenarios to illustrate how the directions of rotation of the wheels affect the movement of the robot. In example (a) all the wheels rotate forward. The resulting movement of the robot is thus a forward movement. In example (b), the wheels turn in different directions. Those on the left side of the robot rotate inwards, while the wheels on the right side rotate outwards. The resulting movement is in the Y direction on a plane. In the third example the wheels on the left side rotate backwards while the wheels on the right side rotate forwards. This results in a counterclockwise rotation of the robot.

Figure 4: robot motion depending on the rotational direction of the mecanum wheels

motor control

With the Arduino, the motor drivers were also removed because they were mounted on the controller. Therefore, a board has been developed with two Toshiba TB6612FNG motor drivers. Each driver can handle two motors, so two of them were used in total. The drivers are slightly undersized, because the motors are rated at a maximum current of 1.4A. Since the motor drivers only withstand a maximum current of 1.2A, the motor power is limited to a maximum of 80% in the software. To control the drivers (through the STM32F4), a PWM signal is generated by a timer. This signal causes the driver to apply a corresponding voltage to the motors. So the timer-generated PWM signal controls the power of the motors.

incremental encoder

Incremental encoders are sensors for measuring rotations. With the use of a timer, it is possible to measure the angular velocity of a rotation. The angle can be calculated by integration over time. Because of the fixed radius of the wheels, the driven distance can be determined by multiplying the angle by the radius.
The Nexus robot has four photoelectric incremental encoders, which generate 12 pulses per rotation. Photoelectric encoders emit a light pulse through a rotating disc which has a couple of slits. Thus, a periodic signal is generated, which allows conclusions about the velocity and the direction of a wheel. This signal is interpreted by the STM32F4 controller and processed by the software. The rotations of the motors are transmitted by a gear with a ratio of 64:1, so the pulse count is increased to 768 pulses per rotation. The STM32F4 controller has quadrature encoder support, so the steps per rotation are again increased by a factor of four to 3072 in total. Thus, the resolution of the incremental encoder is approximately 0.117°.

The following example shows the initialization of an encoder as a quadrature encoder:
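A sketch using ST’s Standard Peripheral Library (the timer instance TIM3 and the GPIO setup, which is omitted here, are assumptions):

```c
/* put the timer into quadrature decoding mode on both input channels */
TIM_EncoderInterfaceConfig(TIM3, TIM_EncoderMode_TI12,
                           TIM_ICPolarity_Rising, TIM_ICPolarity_Rising);
TIM_SetAutoreload(TIM3, 0xFFFF);   /* count over the full 16-bit range */
TIM_Cmd(TIM3, ENABLE);
```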

Afterwards, the counted pulses can be read by the timer register:
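For example (TIM3 as in the sketch above):

```c
/* the counter increments/decrements with the quadrature signal */
uint16_t pulses = TIM_GetCounter(TIM3);
```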

gyroscope

The gyroscope is used to measure the relative angular change of a rotation. The robot was equipped with a GyroClick module from MikroElektronika, which has an L3GD20 rate sensor (see figure5). The robot can only move on the ground, so it is sufficient to measure the yaw rate. So the rotations along the Z-axis will be used to calculate the orientation of the robot. The gyroscope measures angular velocities; the relative angle can be calculated by integration over time. For the communication with the STM32F4 controller, the SPI bus is used. The gyroscope works with a sample rate of 760Hz.

Figure5: GyroClick rate sensor

The gyroscope drifts very much, so it was necessary to calibrate it. Therefore, two calibration steps were accomplished during the initialization phase. First, the bias was measured by a couple of reference measurements in the zero position. The bias was calculated as the average over the measurements. Figure6 shows an uncalibrated measurement in the zero position. It is apparent that the measured value is shifted by an offset of -56. This corresponds to approximately -0.49°/s at a resolution of 8.75 mdps/digit.

Figure6: Measurement in zero position for calculating the bias

The second step is used to reduce the influence of noise. In the software, a threshold defines the minimum angular velocity needed to detect an angle change. So it is not possible to measure very small angle changes, but the drift is reduced considerably, which makes this an acceptable trade-off. To get the relative angle of a rotation, the angular velocity has to be integrated over time. In the software, the integration is implemented with the trapezoidal rule. The following example shows the offset calculation as the average over 2000 measurements:

The next code snippet shows the calculation of the threshold for the noise:

With these values, it is possible to get the relative angle by integration with the trapezoidal rule. The result is mapped to the interval [0°,360°]:

odometry

Odometry is a method to calculate the position of ground based robots by considering the rotation of the wheels (or the steps of humanoid robots). Many robots are able to measure the wheel rotation anyway, so it is very popular to use this information for localization of the robot. The covered distance of a rotation can be calculated from the covered angle and the radius of the wheel. So the complete path of the robot can be traced. The sampling rate is 5ms, so a lot of (odometry) data is produced. This data is stored on a microSD card via SPI, with the use of a DMA controller and the FatFs library. The odometry data is available for later evaluation. Figure7 shows the traced data of a test drive. The route had a distance of 6280mm and the robot was localized every 5ms. The robot has an offset while driving straight ahead. This is because there is no speed regulation for the motors implemented at the moment.

Figure7: Traced odometry data

Appendix

Posted in embedded systems, microcontroller, Sensors, STM32F4 | Tagged , , , , , , , , , , | 4 Comments

Yakindu Statechart Tools Arduino Integration

The Yakindu Statechart Tools are predestined to describe the behaviour of a system and afterwards generate code. The output code can be Java, C or C++. We would like to use the code generator in the context of embedded systems, so we decided to generate code for an Arduino Uno. Therefore we modified our statechart TrafficLight example and implemented some simple glue code to map the hardware timer and the I/O-ports. Furthermore it was necessary to build a hardware controller that represents the traffic light, so we created a simple board (it’s available as an eagle plan in the appendix and can easily be reconstructed). The example is directly implemented for the AVR ATMEGA328P-PU processor (this one is on the Arduino Uno). It is possible to use the Arduino library to integrate this system, but we would like to support the whole AVR family, so it was the better way to use the AVR library directly. With this solution the Yakindu SCT can be used on many AVR based processors.

To explore the example project, you have to go through some steps:

1. Environment

  • Download the eclipse IDE with Yakindu SCT from our download section, or install the Yakindu SCT by using the Eclipse update manager (see: http://statecharts.org/download.html).

  • On Linux, install the AVR  environment:
    sudo apt-get install avrdude binutils-avr gcc-avr avr-libc gdb-avr.
  • Get the Arduino software from http://arduino.cc/en/Main/Software and build the necessary libArduinoCore.a. See http://playground.arduino.cc/Code/Eclipse#Arduino_core_library for detailed description. Now set the include path to the Arduino specific header files and also to the libArduinoCore.a. The libArduinoCore.a should be stored in a folder called “lib” in the project directory. The include path should look like this:
  • Open Eclipse and import the ArduinoTrafficLight example project from our Git repository (https://github.com/Yakindu/statecharts/tree/master/). You can find this project in the archive folder. After importing the project, you have to check if the include-path is set correctly for your system.
  • Build the TrafficLight hardware controller (see circuit layout in the appendix) and connect the ports to the Arduino Uno.
  • Next you have to generate the C++ code by using the TrafficLight.sgen (right click on the file and then generate artifacts). Currently there is an open issue and you have to fix it manually: open the src-gen/sc_types.h file and change the datatype sc_integer from int32_t to uint32_t. That’s all. Next, compile the code. For this, you can use the icon with the small hammer in eclipse.
  • Last but not least you have to flash the compiled program on the processor by activating the AVR-icon.

Now you are able to use the generated code on the Arduino board. Feel free to modify the statechart model and test the behaviour on the hardware platform.

2. Some explaining words to the code

There are three things you have to implement as glue code, so that the Arduino works:

2.1 CycleRunner

The CycleRunner checks periodically for state changes. It uses timer0 to trigger a completion step periodically. An implementation could look like this:
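A sketch on the ATmega328P using avr-libc (the statechart instance name and the runCycle() call follow the Yakindu C++ generator conventions; verify them against your generated code):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

#include "TrafficLight.h"     /* generated statemachine from the example */

static TrafficLight statechart;   /* instance name is an assumption */

void initCycleRunner(void)
{
    TCCR0A = (1 << WGM01);                /* timer0 in CTC mode */
    TCCR0B = (1 << CS02) | (1 << CS00);   /* prescaler 1024 */
    OCR0A  = 155;                         /* ~10 ms at 16 MHz */
    TIMSK0 = (1 << OCIE0A);               /* enable compare match interrupt */
    sei();
}

ISR(TIMER0_COMPA_vect)
{
    statechart.runCycle();   /* perform one statechart completion step */
}
```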

2.2 Timer and TimeEvent

The Timer class is a representation of the hardware timer and it is responsible for initializing the timer and raising TimeEvents. In this example we are using timer1 of the Arduino to implement the reaction trigger timer. A TimeEvent is triggered when e.g. a transition time has expired. There are situations where you have to use several timers running in parallel, thus it was required to store the TimeEvents in an array (see the events array in Main.cpp). The Timer checks every 10ms for expired time events and raises them if necessary.

2.3 IO port mapping

At the moment it is necessary to map the IO ports by yourself. You have to define the port direction and define a method to change the signals. For example you can map the red traffic light LED like this:
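A sketch using avr-libc (the pin choice PB0 is an assumption):

```c
#include <avr/io.h>

/* map the red traffic light LED to PB0 (pin choice is an assumption) */
void initRedLedPort(void)
{
    DDRB |= (1 << DDB0);        /* configure PB0 as output */
}

void setRedLed(uint8_t on)
{
    if (on)
        PORTB |= (1 << PB0);    /* LED on */
    else
        PORTB &= ~(1 << PB0);   /* LED off */
}
```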

Now you are able to change the LED signal by using the getter method from the statemachine interface:
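Assuming the model defines a boolean variable red in its default interface, the glue code could look roughly like this (the getter name is derived from the statechart model, so it will differ for your interface):

```c
/* hypothetical getter; the real name depends on the statechart model */
setRedLed(statechart.getDefaultSCI()->get_red() ? 1 : 0);
```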

3. Appendix

The movie demonstrates the traffic light with the statechart behaviour. The designed model is shown in the following image.

Yakindu trafficlight statechart

Traffic light circuit layout

Posted in AVR, embedded systems, itemis, Statecharts, Yakindu | Tagged , , , , , , , , , , , | 3 Comments