
Project - Cu.1 Synchronised Multi-display

Group Members

Felix Pountney - 46986700

In this project, I will be responsible for synchronising the M5Core2s' internal clocks, ensuring all devices operate on a unified timeframe. Once that is complete, I will tackle the custom animation displayed across the 9 nodes. I will also integrate the accelerometer to capture user movement, in particular a shake gesture that pauses the animation, allowing for interactive control. Finally, I will implement two-way communication with the MQTT server, enabling the project to send and receive messages as needed.

Daniel Cottrell - 46991656

In this project, I will focus on reading the gyroscope sensor to determine the orientation of each M5Core2. I will then implement the capability to adjust that M5Core2's display so that it remains aligned and continuous with the rest of the displays. I will also help with the GUI, displaying information about the orientation of the M5Core2s.


Milestone

Project and Scenario Description

The project we have decided to do is the Synchronised Multi-display. This involves:

  • Using nine M5Core2 displays to create a synchronised display that shows a graphics animation
  • Synchronising the array of M5Core2s so that it appears as a single display
  • Using Zephyr on the M5Core2s
  • Using the Wi-Fi protocol

For further details, refer to the Key Performance Indicator section below.

Key Performance Indicators

The KPIs for this project are:

  1. Time synchronisation of the M5Core2s

This is measured through a combination of the data displayed on the GUI and the visual synchronisation of the animations across the different M5Core2s. It demonstrates the success of using MQTT to transfer information, JSON to package data, and the time synchronisation algorithm. The following performance thresholds would be used; a sketch of one possible synchronisation scheme is given after the table.

Time delay    Condition
<150ms        Good
<800ms        Satisfactory
>800ms        Improvement Needed
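
As a rough illustration of the time synchronisation algorithm, the sketch below shows one possible approach under Zephyr: a designated master node publishes its millisecond clock over the time-sync topic, and every other node stores the offset between that clock and its own uptime. The function names and the round-trip correction are assumptions for illustration, not the project's actual implementation.

```c
/* Time-sync sketch (illustrative only). Assumes a designated master node
 * publishes its millisecond clock over the time-sync MQTT topic and each
 * node keeps an offset against its own uptime.
 */
#include <zephyr/kernel.h>

static int64_t clock_offset_ms;   /* master clock minus local uptime */

/* Called from the MQTT receive handler when a sync message arrives.
 * master_ms : timestamp carried in the JSON payload
 * rtt_ms    : measured request/response round trip, or 0 if unknown
 */
void time_sync_update(int64_t master_ms, int64_t rtt_ms)
{
    int64_t local_ms = k_uptime_get();

    /* Approximate the one-way latency as half the round trip. */
    clock_offset_ms = (master_ms + rtt_ms / 2) - local_ms;
}

/* Unified time base shared by all nine displays. */
int64_t synced_time_ms(void)
{
    return k_uptime_get() + clock_offset_ms;
}
```

The animation code would then derive the current frame from synced_time_ms() rather than the raw local clock, which is what the GUI's synchronisation statistics would be comparing.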
  2. Gyroscope rotation reaction

This would be measured visually: the animation on a display should flip as the device is rotated 180 degrees. This index would show the ability to read the gyro sensor and convert its readings into orientation information. The following performance thresholds would be used; a sketch of one possible orientation check follows the tables.

Time delay    Condition
<500ms        Good
>500ms        Improvement Needed

Rotation Angle Detection Error    Condition
<10°                              Good
<40°                              Satisfactory
>40°                              Improvement Needed
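
The sketch below shows one possible way to turn IMU readings into a 0°/180° orientation decision: use the gravity component along the screen's Y axis from the accelerometer, with a small hysteresis band so the display does not flicker near 90°. The team may instead integrate the gyroscope readings; the thresholds and names here are illustrative assumptions.

```c
/* Orientation sketch (illustrative): decide whether the display should be
 * drawn normally or flipped 180 degrees, from the gravity component along
 * the screen's Y axis. Thresholds are placeholders, not tuned values.
 */
enum display_rotation { ROT_0 = 0, ROT_180 = 180 };

/* accel_y: Y-axis acceleration in m/s^2 from the IMU. */
enum display_rotation orientation_from_accel(double accel_y)
{
    /* Hysteresis so the display does not flicker near 90 degrees. */
    static enum display_rotation current = ROT_0;

    if (accel_y > 4.0) {
        current = ROT_0;          /* gravity points "down" the screen   */
    } else if (accel_y < -4.0) {
        current = ROT_180;        /* device has been turned upside down */
    }
    return current;
}
```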
  3. PC GUI visualisation of the state of the 9x M5Core2 display

This would be measured through the GUI's reaction in showing the current time synchronisation, orientation, and animation being displayed. It shows the implementation of MQTT and JSON in the computer program. The following performance metrics would be used:

  • Able to display M5Core2 orientation
  • Able to display the current animation used
  • Able to show a shake registered on the accelerometer and which M5Core2 registered it
  • Able to show time synchronisation statistics
  4. PC GUI interaction/control of animation and display order

This would be measured through the ability to change the animation and the display order through the GUI. It shows the ability to publish to MQTT, as well as the M5Core2s' ability to adjust the display order in real time. The following performance metrics will be used:

  • Able to change animation shown on the display grid
  • Able to adjust the M5Core2 order in the display grid
  • Able to manually adjust time-synchronisation statistics
  5. Accelerometer shake registration and reaction

This would be measured through the pausing or playing of the animation when a shake is registered. It shows the ability to read the accelerometer sensor and to synthesise this data into registering a shake. Performance would be measured by the reliability of registering a shake and the time taken to pause the animation, using the following metrics; a sketch of one possible detection approach follows the tables.

Time delay    Condition
<500ms        Good
>500ms        Improvement Needed

Accuracy of shake detection    Condition
>85%                           Good
>65%                           Satisfactory
<65%                           Improvement Needed
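
As a rough illustration of shake registration, the sketch below flags a shake when the magnitude of the measured acceleration deviates from 1 g by more than a threshold. The threshold value is a placeholder; a real implementation would likely also debounce the signal (for example, requiring several consecutive samples) before pausing the animation.

```c
/* Shake-detection sketch (illustrative). A shake is flagged when the
 * magnitude of the measured acceleration deviates from 1 g by more than
 * a threshold; the threshold is a placeholder, not a tuned value.
 */
#include <math.h>
#include <stdbool.h>

#define GRAVITY_MS2       9.81
#define SHAKE_THRESH_MS2  6.0

bool shake_detected(double ax, double ay, double az)
{
    double magnitude = sqrt(ax * ax + ay * ay + az * az);

    return fabs(magnitude - GRAVITY_MS2) > SHAKE_THRESH_MS2;
}
```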
  6. Animation consistency

This would be measured through the consistency of the time between animation frame changes and shows the use of Zephyr, in particular a multi-threaded design, so that the animation is not held up by MQTT reads and writes or by external interrupts such as gyroscope reads. The following performance thresholds would be used; a sketch of one possible threading structure follows the table.

Time Between Animation Change    Condition
<50ms                            Good
<200ms                           Satisfactory
>200ms                           Improvement Needed
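
The sketch below illustrates the multi-threaded structure described above: the animation runs in its own Zephyr thread and derives the current frame from the shared, synchronised clock, so MQTT traffic and sensor handling cannot stall frame updates. draw_frame() and the frame count are placeholders, not the project's actual code.

```c
/* Threading sketch (illustrative): the animation runs in its own Zephyr
 * thread so MQTT reads/writes and sensor handling cannot delay frame
 * updates. draw_frame() is a placeholder for the project's display code.
 */
#include <zephyr/kernel.h>

#define FRAME_PERIOD_MS 50      /* matches the <50 ms KPI target */
#define FRAME_COUNT     64      /* placeholder animation length  */
#define ANIM_STACK_SIZE 2048
#define ANIM_PRIORITY   5

extern void draw_frame(int frame_index);   /* hypothetical display call */
extern int64_t synced_time_ms(void);       /* from the time-sync sketch */

static void animation_thread(void *a, void *b, void *c)
{
    ARG_UNUSED(a);
    ARG_UNUSED(b);
    ARG_UNUSED(c);

    while (1) {
        /* Derive the frame from the shared clock so all nine displays
         * render the same frame at (nearly) the same moment. */
        int frame = (int)((synced_time_ms() / FRAME_PERIOD_MS) % FRAME_COUNT);

        draw_frame(frame);
        k_sleep(K_MSEC(FRAME_PERIOD_MS));
    }
}

K_THREAD_DEFINE(anim_tid, ANIM_STACK_SIZE, animation_thread,
                NULL, NULL, NULL, ANIM_PRIORITY, 0, 0);
```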

System Overview

The system overview shows how the hardware components interact with each other, together with top-level flowcharts of the software running on them, as presented in the diagrams below.

System Hardware Interactions:

(Figure: hardware architecture diagram)

PC Implementation:

(Figure: PC implementation flowchart)

M5Core2 Implementation:

(Figure: M5Core2 implementation flowchart)

Sensor Integration

The sensors used are the gyroscope and accelerometer built into each M5Core2. They are used to determine the orientation and acceleration of the device. They are integrated by sending their readings to the PC, which processes them and sends back an updated display configuration for the M5Core2s. This is outlined in the diagram below.

(Figure: sensor and system integration diagram)

Wireless Network Communication

Network Topology

The network topology will consist of Wi-Fi communication via two separate MQTT topics. The first is for time synchronisation and will be used by the nine M5Core2s; the second is for commands and status updates and will be used by the PC and the nine M5Core2s. A possible topic layout is sketched below.
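
The topic names in this sketch are placeholders, since the wiki only specifies that one topic carries time synchronisation and the other carries commands and status updates.

```c
/* Illustrative topic layout; names are assumptions, not the project's. */
#define TOPIC_TIME_SYNC  "multidisplay/timesync"   /* used by the nine M5Core2s           */
#define TOPIC_CMD_STATUS "multidisplay/control"    /* used by the PC GUI and the M5Core2s */
```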

IoT Protocol

The IoT Protocol used will be MQTT with JSON encoding for both of the MQTT topics.

Data Rate

Each MQTT JSON packet contains between 20 and 62 characters, which corresponds to 20-62 bytes under ASCII/UTF-8 encoding. Since two packets per second are sent during peak communication, this results in a data transfer rate of 40-124 bytes per second. An example payload in this size range is sketched below.
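
For illustration, the snippet below builds a compact status payload of the kind described above and prints its length; the field names are placeholder assumptions, not the project's actual JSON schema.

```c
/* Payload sketch (illustrative): a compact status message in the 20-62
 * byte range described above. Field names are placeholders, not the
 * project's actual JSON schema.
 */
#include <stdio.h>

int main(void)
{
    char payload[64];

    /* e.g. node id, orientation in degrees, shake flag */
    int len = snprintf(payload, sizeof(payload),
                       "{\"id\":%d,\"rot\":%d,\"shake\":%d}", 4, 180, 0);

    printf("%s (%d bytes)\n", payload, len);   /* 28 bytes for this example */
    return 0;
}
```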

Message Protocol Diagrams

(Figure: MQTT time synchronisation message protocol)

(Figure: MQTT shake acknowledge message protocol)

Algorithm Schemes

No AI or machine learning algorithms are used in this project, since the displays do not need to learn anything and instead rely only on the gyroscope and accelerometer data. However, MQTT and other IoT techniques are used to transfer information back and forth.

DIKW Pyramid Abstraction

An example scenario the system can operate in is a billboard or advertising display, which uses multiple screens to attract people's attention but needs them to be synchronised to look coherent.

(Figure: DIKW pyramid abstraction)

Equipment

The equipment used in this project is the 9 M5Core2s and a PC displaying a GUI of the real-time information received.

Week 11

This week, we worked on getting the sensor values from the M5Core2s, and printing out the gyroscope and accelerometer data. We also set up the basis for the MQTT communication of all the M5Core2s, and looked at how to synchronise the animation through time control.
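
A minimal sketch of this week's sensor reading is shown below, assuming the IMU is exposed through Zephyr's generic sensor API and a devicetree alias named imu0 (both assumptions rather than confirmed project details).

```c
/* Sensor-read sketch (illustrative): fetch and print accelerometer and
 * gyroscope samples through Zephyr's generic sensor API. The devicetree
 * alias "imu0" is an assumed name, not necessarily the project's.
 * Printing floats with printk requires CONFIG_CBPRINTF_FP_SUPPORT=y.
 */
#include <zephyr/kernel.h>
#include <zephyr/device.h>
#include <zephyr/drivers/sensor.h>

void imu_print_loop(void)
{
    const struct device *imu = DEVICE_DT_GET(DT_ALIAS(imu0));
    struct sensor_value accel[3], gyro[3];

    if (!device_is_ready(imu)) {
        printk("IMU device not ready\n");
        return;
    }

    while (1) {
        sensor_sample_fetch(imu);
        sensor_channel_get(imu, SENSOR_CHAN_ACCEL_XYZ, accel);
        sensor_channel_get(imu, SENSOR_CHAN_GYRO_XYZ, gyro);

        printk("accel: %.2f %.2f %.2f  gyro: %.2f %.2f %.2f\n",
               sensor_value_to_double(&accel[0]),
               sensor_value_to_double(&accel[1]),
               sensor_value_to_double(&accel[2]),
               sensor_value_to_double(&gyro[0]),
               sensor_value_to_double(&gyro[1]),
               sensor_value_to_double(&gyro[2]));

        k_sleep(K_MSEC(500));
    }
}
```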

Week 12

This week, we set up the GUI that will display the information, and worked out a JSON protocol for sending messages that the GUI could pick up via MQTT and display. We also fine-tuned the system, adding the capabilities that were outlined in the KPIs.

Project Configurability and Field Deployment Plan

Outline of steps taken to ensure that the project firmware and hardware can be (re)configured for additional features:

  • The firmware is modularised into distinct sections for sensor reading, displaying the GUI, network communication, and time synchronisation
  • USB interfaces can be used for initial setup and configuration, allowing settings to be loaded onto the M5Core2s and hardware issues to be troubleshot

Outline of steps taken to ensure that the project can be deployed in the field by non-project members:

  • Pre-configure each unit based on its specific position in the display array
  • Ensure that a reliable Wi-Fi access point is available, and test the network coverage and signal strength
  • Power on the M5Core2 units, using the PC GUI to check synchronisation and sensor readings