Sunday 31 May 2015

i.MX6SX - UDOO NEO Early hardware prototype

A few weeks back I received an early hardware prototype of the UDOO NEO. It hosts one of the newer members of the i.MX 6 family, the i.MX 6SoloX, which integrates a Cortex-A9 with a Cortex-M4. This seems to be Freescale's first venture into a heterogeneous SoC, and interestingly there may be more down the road with the introduction of the i.MX 7 (Cortex-A7 + Cortex-M0/M4).

What is striking about the i.MX 6SX architecture is that the primary processor is the Cortex-A9. Therefore booting the i.MX 6SX requires U-Boot (or another i.MX 6 bootloader) to be present. Once the A9 is active the secondary processor (Cortex-M4) can be brought online, either through U-Boot or after the A9 starts running Linux. In either case, application code (and/or a bootloader) has to be made available to the M4 via on-board flash or loaded into RAM. The M4 is brought online via a reset, following the standard Cortex-M4 start-up process of booting from a vector table at address zero (which in the A9 world is mapped to 0x007f8000). From what I understand, address zero is 32K of TCM (tightly-coupled memory). The M4 seems to be capable of interacting with most of the on-chip peripherals provided by the i.MX 6SX. Other areas of interest on the i.MX 6SX are:
  • A semaphore mechanism (SEMA4) to manage contention between the two processors
  • A messaging mechanism (MU, Messaging Unit) to enable the two processors to communicate and coordinate by passing messages
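As a rough illustration, bringing the M4 online from U-Boot might look like the following (a sketch only, assuming Freescale's U-Boot fork with its bootaux command and an M4 image named m4.bin on the first FAT partition; the file name is illustrative):

```
# load the M4 image into the region that maps to the M4's vector table
fatload mmc 0:1 0x007f8000 m4.bin
# release the M4 from reset so it boots from its address zero (TCM)
bootaux 0x007f8000
```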
The i.MX 6SX isn't intended as a multimedia processor, and this is highlighted by the lack of a VPU and the inclusion of a low-spec GPU, the Vivante GC400T, which claims up to 720p@60fps and is limited to OpenGL ES 2.0.

What is unusual is the inclusion of an analogue (PAL/NTSC) video to digital converter. Given that Freescale's target market is automotive, I suspect this is for applications like reverse-parking cameras.

In its current form the NEO has a small footprint (85mm x 60mm) and the notable on-board peripherals are:
  • Ethernet (10/100) - KSZ8091RNA
  • Wifi/bluetooth - WL1831MOD
  • HDMI  - TDA19988
  • Gyroscope - FXAS21002C
  • Accelerometer + Magnetometer - FXOS8700CQ
  • 2 user-defined LEDs
  • 1 USB 2.0 type A female connector
The FXAS21002C and FXOS8700CQ are currently interfaced through I2C, and the datasheets indicate a maximum clock speed of 400kHz (in Fast Mode).

 

Software for the NEO is still in the early stages of development, so it has taken me a couple of weeks of development effort to get U-Boot and a kernel functioning. Above is a short video demonstrating the fruits of my efforts. On power-up we launch a small application on the M4 from U-Boot which toggles (flashes) the red LED and continually outputs a message on UART 2 (shown later in the video). We then boot the kernel to an Ubuntu rootfs and launch an OpenGL ES application which outputs a rotating square to HDMI (720p). The speed of rotation increases the more the board is tilted to the left or right (using the accelerometer). Note the red LED continues to toggle, highlighting the fact that the Cortex-M4 is uninterrupted by the A9.

Having started to develop code for the TDA19988 interface, I found one useful feature of this controller is its ability to drive either DVI or HDMI displays. The downside is that the GPU struggles beyond 720p in my tests.

Another point to note is that there is no on-board flash for the M4. Therefore code for the M4 has to be loaded into RAM, which means reserving a small portion away from the kernel. This introduces another layer of complexity, as the reserved area needs to be protected from accidental access.
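One way to carve such a region out on a device-tree based kernel is a reserved-memory node, so the kernel never maps it (a sketch only; the address and 1MB size here are purely illustrative, not the values the NEO actually uses):

```
/* illustrative reserved-memory node protecting the M4's code/data */
reserved-memory {
    #address-cells = <1>;
    #size-cells = <1>;
    ranges;

    m4: m4@bff00000 {
        no-map;                       /* kernel must not map this region */
        reg = <0xbff00000 0x100000>;  /* 1MB for M4 code and data */
    };
};
```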

Developing for the board definitely gives you a good insight into its usability. This leads to my wish list, if the board were to be improved:

1. 2 x USB 2.0 type A connectors, for a keyboard and mouse; useful when paired with HDMI.
2. 2 x Ethernet (this would definitely make it stand out from competing SBCs).
3. Use SPI for the FXAS21002C & FXOS8700CQ, or ideally replace them with a 9-axis IMU.
4. I'd prefer the sensors to be on a separate PCB connected to the main board via a jumper cable. The main reason is that manoeuvring the board to test the sensors with the cables attached (ethernet, serial, power, USB, HDMI) isn't that practical.
5. On-board JTAG support; it is very difficult to debug the M4 code without it!
6. There are two UART outputs on the NEO, one for the A9 serial console and the other for the M4. Similar to the UDOO, it would be very useful if these were exposed via a single USB connector. In the current design you need to hook up two serial-to-USB adapters.

Wednesday 15 April 2015

IOT - ESP8266 (ESP-201) + CC1110 (XRF)

Over the past few months there has been a huge amount of interest in the ESP8266, which offers a low-cost serial-to-WiFi solution. The introduction of the ESP8266 by Espressif coincides with the increased hype around IoT (Internet of Things) as the next major emerging technology. Realistically IoT has been around for decades in different disguises, so the concept isn't new. I'm guessing the renewed interest is partly because sensors, processors and networks are dirt cheap and wireless technology is ubiquitous. Coupled with the rise of the maker culture, it now means the cost of entry for connecting devices is low.

There are numerous modules available that host an ESP8266, and these can be found for a few dollars; however most have limited pins broken out, just UART plus some GPIO. In my opinion a better option is the ESP-201, which has a larger number of pins broken out and an external U.FL (IPX) antenna connector. With a slight modification the module can be made breadboard compatible.



The main drawbacks with the ESP-201 are:

1. Lack of CE/FCC compliance (any certification applies to the ESP8266 chip, not the module).
2. The pin markings are on the underside of the PCB, therefore not visible when it is sitting in a breadboard.
3. The UART pins protrude on the underside of the PCB, which means it can't be plugged into a breadboard without bending these pins at a right angle (as shown) or, ideally, de-soldering the pins and re-attaching them the correct way round.



To evaluate the ESP8266 I chose to integrate it with an existing wireless (868MHz) set-up which is used for monitoring a number of sensors as well as providing periodic temperature readings. The existing radio set-up uses Ciseco's XRF, which hosts TI's low-power CC1110 transceiver. It runs custom-developed firmware that is controlled through one of the two CC1110 UART ports. The plan was to extend the firmware so that the CC1110 could be controlled over TCP by running a socket server on the ESP8266. Compared to the ESP8266, the CC1110 has a wealth of reference documentation, which increases the options for interfacing with other devices in addition to easing the programming burden. The main drawback with the CC1110 is the cost of the development tools, although it is possible to use SDCC (Small Device C Compiler) as an alternative.

Initially I was hoping to use I2C/SPI as the interface between the CC1110 and the ESP8266. However, because the CC1110 doesn't support hardware I2C and I had only a few free I/O pins remaining, I was left with one option: the second UART port.


Espressif provide an SDK that can be flashed to the ESP8266, which provides a simple AT-command-style interface to program the device. Note, the SDK is continually being updated, so check for later releases. One quirk with the ESP-201 is that IO15 has to be grounded for the device to function. To flash the device, IO0 has to be grounded. Instructions for SDK set-up on Linux can be found here. To flash the AT firmware for SDK 1.0 on Linux we can issue the following command:

esptool.py write_flash 0x00000 boot_v1.2.bin 0x01000 at/user1.512.new.bin 0x3e000 blank.bin 0x7e000 blank.bin

After flashing the ESP-201, the bulk of the coding was done on the CC1110, which mainly entailed sending AT commands to the ESP-201 to initiate a connection to the access point, launch a socket server and send updates from the sensors. The sequence of AT commands was similar to this:

AT+RST
ATE0
AT+CWMODE=1
AT+CWJAP="SSID","Your Password"
AT+CIPMUX=1
AT+CIPSERVER=1,80

After coding up the AT commands on the CC1110, I could test by launching a telnet session on port 80 to the IP address allocated via DHCP from the AP. Sensor data output from both the CC1110 UART port and the ESP-201 is shown below.




Coding the above highlighted a number of pitfalls with the AT firmware and hardware.

  • It can be tedious to parse the verbose and inconsistent responses returned by the ESP8266 to AT commands. To tone down the verbose responses I used ATE0; however it's not persistent, so it needs to be resent after a reset.
  • Resetting (AT+RST) or joining an access point (AT+CWJAP) can be slow, therefore you need to carefully select relevant timeout values.
  • STA mode (AT+CWMODE=1) can silently disconnect after a random time.
  • The ESP8266 isn't particularly well suited to battery-powered devices because it can consume up to 300mA.

It is possible to write your own firmware instead of using the pre-built AT firmware, which in my opinion is a better option. Espressif provide a set of closed-source C libraries which offer a finer level of control compared to the AT firmware. Having spent a considerable amount of time writing custom firmware to interface with the CC1110, here are my findings:

  • Although there is a second UART available on the ESP8266, in most circumstances only the TX pin is available (its primary use is debugging), because the RX pin is used as part of the SPI interface to the flash memory.
  • There is no in-circuit debugging option; you're reliant on sending trace output to the UART port or somewhere else.
  • Although SSL support is provided, it seems to be a hit-and-miss affair between SDK versions.
  • The API is closed source, so you're reliant on Espressif providing regular updates for new features or bug fixes.
  • No hardware encryption.
  • Not all I/O features are available, e.g. RTC or I2C.

Given the amount of attention the ESP8266 has received, it is fair to say it offers a low-cost and rapid approach to prototyping a WiFi solution for existing hardware or for a new application. However, you could argue that most of the attention has come from hobbyists and not commercial ventures. In my opinion it is worth exploring the other WiFi SoCs coming to the market this year, such as:


Furthermore, it is still not clear whether WiFi (2.4GHz or 5GHz) is the ideal medium for wireless IoT, as the wake-up and connect times aren't particularly quick. The other point to make is that the cost of some of the above SoCs can make them overlap with the traditional networking SoCs used in low-cost router boards. One example is the AR9331, which supports a full Linux stack and can be used for video streaming or complex routing, something the WiFi SoCs may find hard to achieve.

Sunday 25 January 2015

RK3288 - Firefly development board

I received this board just over a month ago from the Firefly team and have been keen to assess its development capabilities given it hosts a quad-core Cortex-A17 (or technically a Cortex-A12) processor.


On board is a Rockchip RK3288 processor which on initial glance has a pretty decent specification:

  1. 4Kx2K H.264/H.265(10-bit) video decoder
  2. 1080p H.264 video encoder
  3. Up to 3840X2160 display resolution
  4. 4Kx2K@60fps HDMI 2.0
  5. eDP 4Kx2K@30fps
  6. According to Rockchip the GPU core is listed as a Mali-T764 GPU although it's reported as a Mali-T760 by the Mali drivers.
  7. Ethernet controller supporting 10/100/1000Mbps

Given the above, I think it is important to clarify what 4Kx2K actually means, and the clue is in point 3. Having spent many hours debugging the kernel display driver code, it turns out the RK3288 has two display controllers known as VOP_BIG and VOP_LIT (I presume for little). VOP_BIG supports a maximum resolution of 3840x2160, which equates to 4K UHD (ultra-high-definition television), and for VOP_LIT it's 2560x1600 (WQXGA). Each controller can be bound to a display interface, e.g. HDMI, LVDS or eDP (sadly missing on the Firefly). If you define 4Kx2K as 4096x2160, also known as DCI 4K, then the definitions can be misleading. The numbers also align with H.264/VP8/MVC decoding, which maxes out at 2160p@24fps (3840x2160), although the HEVC (H.265) decoder contradicts this by supporting 4K@60fps (4096x2304). What is also interesting is that the image processing engine can upscale to 3x the input, which would imply 720p can be upscaled to 4K UHD.

The Firefly seems to be based on an Android TV box reference design, and it could be argued that it's targeted as an Android-centric development board. The noticeable peripherals are:

1. VGA support + HDMI
2. On board microphone
3. Wifi Connector (RF Connector SMA Female)
4. IR Sensor
5. 16GB eMMC

Firefly supply a pre-built dual boot image (on eMMC) that offers Android 4.4
and Ubuntu 14.04.

Android 4.4


Firefly supply the Android SDK sources so that customised images can be built from source. What is nice is that Android Studio integrates well with the Firefly, which eases development of Android apps, especially for newcomers. Furthermore, an app can be remote-debugged while running on the Firefly. I suggest that you sign your app with the platform key from the SDK to ease integration when remote debugging. One pitfall to bear in mind is that Android 4.4 implements SELinux, so you may find accessing I/O via sysfs (e.g. GPIO) from your Android app is severely restricted.

Ubuntu 14.04


The Ubuntu image uses a patched version of the Android kernel and unfortunately has no support for GPU/VPU acceleration.

Linux Support


Historically many ARM SoC vendors have side-stepped any request to provide meaningful Linux support and instead rely on the developer community to progress this as far as it can. Unfortunately the situation is normally exacerbated by the SoC vendor's lack of co-operation on GPU/VPU support. What ARM doesn't seem to recognise is that this fragmentation is clearly benefiting Intel, whose latest low-power SoCs have far superior out-of-the-box Linux support.

As of today I would argue the RK3288 falls midway between no and full Linux support. The reason for this is Rockchip's effort to develop a Chromebook device: if you examine the Chromium OS source tree you will find numerous patches submitted by Rockchip.

So the question becomes: can we make use of that source tree? Well, my initial aim was to get a minimal kernel booting and ideally test whether GPU support was possible. The video below shows the progress made after numerous weeks of attempting to bring up a workable kernel. In the video I launch Openbox under X and run es2gears, glmark2-es2 and some WebGL samples in Chromium.



Although the video may seem impressive, the only GPU acceleration available is through EGL/GLES, hence the WebGL examples are accelerated. What is important to bear in mind is that the xf86-video-armsoc driver lacks 2D support for Rockchip, therefore there is still a fair amount of software rendering in X11, plus I implemented workarounds for broken code. Furthermore, performance isn't particularly spectacular; es2gears and glmark2-es2 numbers are here. Unfortunately I haven't had the time to investigate the cause(s); however, further progress may be hindered by the lack of newer Mali drivers from Rockchip/ARM.

For those of you expecting a future Rockchip Chromium OS image to work on an existing RK3288 device, you may be disappointed: unfortunately the hardware differences make this a slim possibility.