
EE5132 / EE5024 Assignment
Sensors-RPi-Node-RED/ThingWorx Stream Analytics
Semester 2, AY2019/20
The I3-SensorHub (I3SH) consists of a set of temperature, light intensity, sound level, humidity and gas
(CO, NO2, methane etc.) sensors connected to a Raspberry Pi (RPi). The RPi may be regarded as an
edge computing node. In the more general case, the sensors can connect to the RPi using wireless IoT
protocols (e.g. LoRa, NB-IoT, Sigfox, Bluetooth, Zigbee etc.), but for simplicity, the abovementioned
sensors are connected to the RPi using wired connections.
The I3SH can be connected to higher level applications such as the Node-RED platform or the
ThingWorx IoT platform using the Message Queuing Telemetry Transport (MQTT) or Representational
State Transfer (REST) protocols.
In this assignment, some groups will (i) connect the I3SH to the dashboard on the Node-RED platform
using MQTT, while other groups will (ii) connect the I3SH to the dashboard on the ThingWorx IoT
platform using REST. Refer to the skeleton Python program and information sheet related to each case.
The objective is to design an application to classify real-time sensor readings into one of 3 user-defined
classes in either (a) static conditions, e.g. Dim, Normal, Bright (these conditions are user-defined), or
(b) dynamic conditions, e.g. “Getting Brighter”, “Not Changing”, “Getting Dimmer”, or “Object
Approaching”, “Object Same Distance” or “Object Departing”. The sensor readings, prediction
probabilities and prediction/classification result are then sent to and displayed on a dashboard on a
Node-RED or ThingWorx mash-up.
(1) Classification of dynamic conditions requires a time series, i.e. multiple sensor measurements over
time, of one or more sensor modalities;
(2) ThingWorx is a powerful IoT platform that can do many things related to IoT; in this assignment,
we are only using its visualization features.
Select either the static or dynamic case that your group would like to address. A group that selects
the static case can only earn up to a maximum of 90% of the total project marks.
Equipment and Software
The RPi in the I3SH hardware has Python 2.7*, NumPy, pandas and scikit-learn packages pre-installed
in the provided software image stored on SD card.
*Note: Python 3 is not used as it is not compatible with the sensor library.
Use your own laptop or PC with Wi-Fi connected to the Internet and an Ethernet port. Connect the
I3SH to the laptop PC using an Ethernet cable. See the connection guide.
Overview of Assignment
This assignment has 3 parts, all implemented in a Python program running on the RPi:
• Make sensor measurements
• Perform classification from the sensor readings
• Send the sensor readings and classification results over MQTT or REST to Node-RED or ThingWorx
Areas of Work
Part 1 – Sensor measurements
Verify that the temperature, humidity, light, sound and gas sensors are working and their measurements
can be read on the RPi. Explain how this can be done.
Some gas sensors may not be working; ignore the sensors that are not working.
Part 2 – Fusion and Classification
This part requires fusion of the sensor readings from multiple sensors in order to perform a 3-class
classification task.
Part 2(a)
Code a 3-class classifier from scratch by writing the mathematical expressions in Python, e.g.
maximum a posteriori (MAP) classifier, maximum likelihood (ML) classifier, Gaussian classifier, k-Nearest Neighbour (k-NN) classifier, naïve Bayes classifier or logistic regression classifier.
Compute the probability estimates for each of the 3 classes.
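As an illustration of what "from scratch" can look like, the sketch below implements a Gaussian naïve Bayes (MAP) classifier in plain NumPy. The class name, its structure and the small variance floor are our own choices for the sketch, not part of the assignment; the mathematical expressions are the standard Gaussian class-conditional likelihoods combined with the class priors.

```python
import numpy as np

class GaussianNB3:
    """Minimal 3-class Gaussian naive Bayes (MAP) classifier, from scratch."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Class priors p(c), and per-class feature means/variances
        self.priors_ = np.array([np.mean(y == c) for c in self.classes_])
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.vars_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        return self

    def predict_proba(self, X):
        # log p(x|c) + log p(c) for each class, then normalise to posteriors
        log_post = []
        for k in range(len(self.classes_)):
            ll = -0.5 * np.sum(np.log(2 * np.pi * self.vars_[k])
                               + (X - self.means_[k]) ** 2 / self.vars_[k], axis=1)
            log_post.append(ll + np.log(self.priors_[k]))
        log_post = np.array(log_post).T
        log_post -= log_post.max(axis=1, keepdims=True)  # numerical stability
        post = np.exp(log_post)
        return post / post.sum(axis=1, keepdims=True)

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

The `predict_proba` output gives the probability estimates for each of the 3 classes that the dashboard expects.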
Part 2(b)
Use one of the classifiers in the scikit-learn (sklearn) library.
Note: Logistic regression and Support Vector Machine (SVM) are binary (2-class) classifiers. Do some
research to find out how to achieve multi-class, e.g. 3-class, classification.
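For example, scikit-learn's `LogisticRegression` handles the 3-class case internally (via one-vs-rest or multinomial formulations, depending on version), and `predict_proba` returns the per-class probability estimates. The toy data below merely stands in for your labelled sensor feature vectors; the class labels are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(1)
# Toy 3-class data standing in for labelled sensor feature vectors
X = np.vstack([rng.randn(30, 4) + m for m in (0.0, 4.0, 8.0)])
y = np.repeat(["Dim", "Normal", "Bright"], 30)

clf = LogisticRegression(max_iter=1000)  # multi-class handled internally
clf.fit(X, y)
proba = clf.predict_proba(X)  # shape (n_samples, 3): per-class probabilities
pred = clf.predict(X)         # predicted class labels
```

For an SVM, `sklearn.svm.SVC` similarly extends to multi-class via one-vs-one voting; pass `probability=True` if probability estimates are needed.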
A sample labelled data set is provided, which may be helpful when you are building and testing the classifiers.
Collect a labelled dataset of sensor readings corresponding to the application you are considering,
with at least 30 examples of each class, and use that to train the classifiers.
Hint: For the dynamic case, it is usually beneficial to consider a window of the past T sensor readings,
e.g. T=5, 10 or 20 etc., of each modality, in addition to considering multiple modalities, as the input
feature vector.
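One possible way to build such a window, sketched here with assumed array shapes (readings arranged oldest-first, one column per modality); the function name is our own:

```python
import numpy as np

def window_features(readings, T=5):
    """Stack the past T readings of each modality into one feature vector.

    readings: array of shape (n_samples, n_modalities), oldest first.
    Returns an array of shape (n_samples - T + 1, T * n_modalities),
    one feature vector per sliding-window position.
    """
    readings = np.asarray(readings, dtype=float)
    n, m = readings.shape
    return np.stack([readings[i:i + T].ravel() for i in range(n - T + 1)])
```

At run time the same windowing is applied to the most recent T readings before calling the trained classifier.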
Use the trained classifier from part 2(a) to classify incoming sensor readings into one of the 3 classes
under different conditions. Collect enough performance results to enable the compilation of a
confusion matrix and probability of detection (PD) and probability of false alarm (PFA).
Repeat the procedure for the trained classifier from part 2(b).
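A minimal sketch of how the confusion matrix and the per-class (one-vs-rest) PD and PFA might be computed from collected results; the function names and the one-vs-rest convention are our own choices:

```python
import numpy as np

def confusion_matrix3(y_true, y_pred, labels):
    """Rows = true class, columns = predicted class."""
    cm = np.zeros((len(labels), len(labels)), dtype=int)
    idx = {c: i for i, c in enumerate(labels)}
    for t, p in zip(y_true, y_pred):
        cm[idx[t], idx[p]] += 1
    return cm

def pd_pfa(cm):
    """Per-class probability of detection and false alarm (one-vs-rest)."""
    total = cm.sum()
    pd, pfa = [], []
    for k in range(cm.shape[0]):
        tp = cm[k, k]
        fn = cm[k].sum() - tp       # class-k samples predicted as another class
        fp = cm[:, k].sum() - tp    # other samples predicted as class k
        tn = total - tp - fn - fp
        pd.append(tp / float(tp + fn))
        pfa.append(fp / float(fp + tn))
    return pd, pfa
```

Each class is treated as the "target" in turn, so PD_k = TP_k/(TP_k+FN_k) and PFA_k = FP_k/(FP_k+TN_k).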
Part 3 – Send sensor readings and classification results using MQTT/REST for dashboard display
In the case of MQTT, the data is sent to a public MQTT broker using defined topics. Refer to the Node-RED-MQTT information sheet. Node-RED subscribes to the relevant topics in order to receive the data for display. Verify that the data values are correctly received at the MQTT broker and by Node-RED before being displayed on the dashboard.
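A hedged sketch of publishing over MQTT with the paho-mqtt library; the broker address and topic prefix below are placeholders, so substitute the broker and the defined topics from the Node-RED-MQTT information sheet:

```python
import time

BROKER = "broker.hivemq.com"  # placeholder public broker; use the one in the info sheet
TOPIC_BASE = "ee5132/grpxx"   # hypothetical topic prefix; use the defined topics

def make_message(name, value):
    """Return the topic and string payload for one attribute.

    The dashboards expect every value as a string, so everything is
    converted with str() before publishing.
    """
    return TOPIC_BASE + "/" + name, str(value)

if __name__ == "__main__":
    # paho-mqtt is imported here so the helper above also works where the
    # library is not installed (pip install paho-mqtt)
    import paho.mqtt.client as mqtt
    client = mqtt.Client()        # paho-mqtt 1.x API, as on the RPi image
    client.connect(BROKER, 1883, 60)
    client.loop_start()
    topic, payload = make_message("light", 123.4)
    client.publish(topic, payload)
    time.sleep(1)
    client.loop_stop()
    client.disconnect()
```

A subscriber such as MQTT Explorer (or the Node-RED MQTT-in node) can then be pointed at the same broker and topics to confirm reception.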
In the case of REST, the data is sent directly to ThingWorx. Refer to the ThingWorx-REST information
sheet. Verify that the data values are correctly received by ThingWorx before being displayed on the
mash-up dashboard.
In the sample program, the light intensity measurement is sent every three seconds using MQTT or
REST to the dashboard for display.
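A sketch of the corresponding REST call, assuming the standard ThingWorx property endpoint (PUT to `/Thingworx/Things/<Thing>/Properties/<Property>` with an `appKey` header). The host, Thing name and application key shown are placeholders; take the real values from the ThingWorx-REST information sheet.

```python
import json

THINGWORX_URL = "https://academic.cloud.thingworx.com/Thingworx"  # placeholder host
APP_KEY = "your-app-key-here"                                     # from the info sheet
THING = "GrpxxThing"                                              # hypothetical Thing name

def property_request(prop, value):
    """Build the URL, headers and JSON body for a ThingWorx property PUT."""
    url = "%s/Things/%s/Properties/%s" % (THINGWORX_URL, THING, prop)
    headers = {"appKey": APP_KEY,
               "Content-Type": "application/json",
               "Accept": "application/json"}
    body = json.dumps({prop: str(value)})  # dashboard expects string values
    return url, headers, body

if __name__ == "__main__":
    # requests is imported here; install with pip install requests
    import requests
    import time
    while True:
        url, headers, body = property_request("LightIntensity", 123.4)
        requests.put(url, headers=headers, data=body, timeout=5)
        time.sleep(3)  # the sample program sends every three seconds
```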
Send the other sensor modalities that you wish to use and verify that Node-RED or ThingWorx can
receive and display the real-time streaming values on the dashboard.
Node-RED and ThingWorx have been configured to expect the following data values:
Attribute                   Data type   Unit   Range
Temperature                 String      –      No range restriction
Humidity                    String      –      No range restriction
Sound level                 String      –      No range restriction
Light intensity             String      –      No range restriction
Class 1 event probability   String      –      0.0 to 1.0
Class 2 event probability   String      –      0.0 to 1.0
Class 3 event probability   String      –      0.0 to 1.0
Prediction                  String      –      Predicted class label, e.g. "Dim", "Bright"
Time (no format)            String      –      –
Extend the provided skeleton program to implement the necessary aspects to achieve the application
that you have in mind.
Possible Extensions
It can be seen that the dashboards currently do not display the gas sensor measurements. As it is not
straightforward to modify the Node-RED dashboard, use an MQTT client program such as MQTT-spy or MQTT Explorer to verify that the gas sensor measurements can be correctly received at the MQTT broker.
Similarly, in the case of ThingWorx, verify that the gas sensor measurements can be correctly received.
You may wish to extend the mash-up dashboard to display these measurements.
Report and Presentation
Write a report of about 10 pages (excluding title page and references; about 2 pages per student) on
the streaming analytics application you have designed and the Python program you have written.
Include the mathematical expressions related to the classification schemes you have studied and
implemented and comment on their performance.
Include some screenshots of the Node-RED or ThingWorx dashboard showing all the relevant sensor
measurements and classification results.
The report should be submitted by Wednesday in Week 13. Name the file Grpxx_Assn2_Report.pdf
where xx is the group number, and upload it to the LumiNUS EE5132 Assignment 2 Student
Submissions folder.
On Thursday of Week 13, each group will do a 20-minute presentation of all aspects of the work (each
group member about 4 minutes), and a 5-minute ‘live’ demonstration of the developed system
comprising the I3SH and the dashboard display. Provide a hardcopy of the report and slides to the
evaluator at the start of your presentation session.
After the presentation, the final version of the report, Python program developed, dataset, slides and
references should be zipped in a single zip file with the group number in the file name, and uploaded to the LumiNUS EE5132 Assignment 2 Student
Submissions folder by Friday of Week 13.
You are required to return the I3SH hardware to the Communications Lab at E4-06-12 in the reading week.
Statement of Contributions
Each member will list his or her areas of contribution (group and individual aspects) on a single
page and all group members will sign this statement. The template of this statement is provided as a
Word file. This statement should be signed by all members in the group and inserted as the last page
of the report (not counted towards the page limit). A scanned version of this signed page should be
included in the zip file.