Android Multi-Sensor System: Design, Testing, And Results

This article provides a comprehensive guide to the design, testing, and evaluation of a PC-orchestrated Android multi-sensor system, focusing on the technical aspects and system performance rather than real physiological experiments. We'll delve into the essential elements that should be included in Chapters 4, 5, and 6 of an MSc thesis, aiming for a high mark (≥80) against standard UK MSc thesis expectations.

1. Diagrams: Architecture, Timing, and Data Flow

In Chapter 4, dedicated to implementation, detailed diagrams are crucial for illustrating the system's design and operation. They give the reader a visual map of the system's components, their interactions, and how data flows between them. The key diagrams include:

1.1. Overall System Architecture Diagram

This diagram serves as a high-level overview of the entire system, showing the PC orchestrator, the Android device, and the sensors (the Topdon TC001 thermal camera, Shimmer3 GSR sensor, and RGB camera) along with their interconnections. It illustrates the communication pathway between the PC and the Android app via a custom TCP/IP link, as well as the Android app's interfaces with the Topdon TC001 (USB or OTG), the Shimmer3 GSR (BLE), and the phone's integrated RGB camera. By clarifying the roles of the hardware and software components and their relationships, for example the PC software module, the Android sensor manager, and the storage module, it gives the reader a cohesive picture of how the elements of the multi-sensor system interact.

1.2. Data Flow Diagram/Pipeline

A data flow diagram is essential for mapping the journey of data from each sensor, through the Android application, to its final storage location, or potentially back to the PC. This diagram, whether structured as a flowchart or a UML activity diagram, delineates critical steps such as sensor signal acquisition, timestamping, buffering, the synchronization process, and data logging on the device, including any data acknowledgment sent back to the PC. By illustrating these processes, the data flow diagram ensures readers understand how multi-sensor data streams are managed in parallel and synchronized within the system, offering a clear view of the data pipeline from acquisition to storage.
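
To make the pipeline concrete in the text, a short code sketch can accompany the diagram. The following Kotlin fragment is purely illustrative (the SensorSample and SamplePipeline names are invented for this example, not taken from the actual app): it shows one way the acquisition, timestamping, buffering, and logging stages could be decoupled, with sensor callbacks enqueueing samples and a background thread writing them to storage.

    import java.io.File
    import java.util.concurrent.LinkedBlockingQueue
    import kotlin.concurrent.thread

    // Illustrative only: one timestamped record from any of the sensor streams.
    data class SensorSample(val sensorId: String, val timestampNs: Long, val payload: ByteArray)

    // Sensor callbacks enqueue samples; a single background thread drains the queue to disk,
    // so slow storage I/O never blocks acquisition (the "buffering" stage in the diagram).
    class SamplePipeline(logFile: File) {
        private val queue = LinkedBlockingQueue<SensorSample>()
        private val writer = logFile.bufferedWriter()

        fun submit(sample: SensorSample) = queue.put(sample)   // called from each sensor's callback

        private val loggerThread = thread(isDaemon = true) {
            while (true) {
                val s = queue.take()
                writer.appendLine("${s.sensorId},${s.timestampNs},${s.payload.size}")
            }
        }
    }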

1.3. Session Sequence Diagram (PC-Android Interaction)

The session sequence diagram illustrates the message exchange between the PC orchestrator and the Android app during a recording session. For example, it shows the PC orchestrator sending a "Start Session" command, prompting the Android app to initialize the thermal camera, GSR sensor, and RGB camera; the sensors then begin streaming data, with periodic sync signals or timestamps exchanged to maintain synchronization, until the PC sends a "Stop Session" command and the Android app stops the sensors and closes the session. By emphasizing the custom TCP protocol commands and the timing of operations, this diagram demonstrates how the PC orchestrator coordinates the session from beginning to end.
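
If helpful, the sequence can also be summarized as a short orchestrator-side sketch. The Kotlin fragment below is a minimal, hypothetical rendering of the same exchange (the ACK_START/ACK_STOP replies and the exact message strings are assumptions for illustration, not the real protocol):

    import java.net.Socket

    // Hypothetical PC-side session sequence: connect, start, record for a while, stop.
    fun runSession(host: String, port: Int, durationMs: Long) {
        Socket(host, port).use { socket ->
            val out = socket.getOutputStream().bufferedWriter()
            val input = socket.getInputStream().bufferedReader()

            out.appendLine("START_SESSION"); out.flush()   // PC -> Android: initialize sensors, begin recording
            check(input.readLine() == "ACK_START")         // Android -> PC: sensors running

            Thread.sleep(durationMs)                       // sensors stream and log during this window

            out.appendLine("STOP_SESSION"); out.flush()    // PC -> Android: stop sensors, close session
            check(input.readLine() == "ACK_STOP")          // Android -> PC: session closed
        }
    }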

1.4. Time Synchronization Timeline Diagram

To understand how time synchronization is achieved, a timeline diagram is essential. It illustrates the synchronization process between the sensor data streams, for instance by aligning sensor readings (thermal frame timestamps versus GSR timestamps) against a common clock. It also depicts any synchronization algorithm or timestamp offset mechanism used, such as the PC sending its time to the Android device or the Android app aligning sensor times to a reference. The diagram demonstrates that multi-sensor data is properly time-aligned and highlights any latency between the PC command and the actual sensor start times.
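
If the thesis describes the offset estimation itself, a worked example helps. The sketch below assumes an NTP-style exchange (PC send, phone receive, phone reply, PC receive); whether the actual system uses this scheme is an assumption, and estimateOffsetMs is an invented helper name:

    // t0 = PC send time, t1 = phone receive time, t2 = phone reply time, t3 = PC receive time
    // (t1 and t2 on the phone clock, t0 and t3 on the PC clock).
    fun estimateOffsetMs(t0: Long, t1: Long, t2: Long, t3: Long): Pair<Long, Long> {
        val offset = ((t1 - t0) + (t2 - t3)) / 2        // phone clock minus PC clock
        val roundTrip = (t3 - t0) - (t2 - t1)           // network delay, excluding phone processing
        return offset to roundTrip
    }

    fun main() {
        // Worked example: phone clock 40 ms ahead, symmetric 10 ms one-way delay, 2 ms processing.
        val (offset, rtt) = estimateOffsetMs(t0 = 1_000, t1 = 1_050, t2 = 1_052, t3 = 1_022)
        println("offset = $offset ms, round trip = $rtt ms")   // offset = 40 ms, round trip = 20 ms
    }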

1.5. Internal Module Diagram (Android App Architecture)

An internal module diagram of the Android application is useful for showing the app's internal structure. Consider including a component or class diagram illustrating modules such as the sensor managers (thermal, GSR, and camera), a time sync module, a data logger, and a network communication module. This gives technical readers insight into the software design and demonstrates a well-structured implementation, showing how the code is organized for maintainability and concurrency.
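
A brief interface sketch can complement the module diagram. The Kotlin fragment below is indicative only (the module names are invented, and the real app's classes will differ); it shows one plausible way to give every sensor the same start/stop contract, reusing the SensorSample type sketched in Section 1.2:

    // Illustrative module boundaries; each sensor gets the same lifecycle contract.
    interface SensorModule {
        fun start(onSample: (SensorSample) -> Unit)   // begin acquisition, push timestamped samples to a callback
        fun stop()
    }

    class ThermalCameraModule : SensorModule {        // Topdon TC001 over USB/OTG
        override fun start(onSample: (SensorSample) -> Unit) = TODO("open the USB/OTG frame stream")
        override fun stop() = TODO()
    }

    class GsrModule : SensorModule {                  // Shimmer3 GSR over BLE
        override fun start(onSample: (SensorSample) -> Unit) = TODO("subscribe to BLE notifications")
        override fun stop() = TODO()
    }

    class TimeSyncModule { fun toReferenceNs(localNs: Long): Long = TODO() }
    class DataLogger    { fun log(sample: SensorSample): Unit = TODO() }
    class NetworkModule { fun onCommand(handler: (String) -> Unit): Unit = TODO() }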

1.6. State Machine Diagram (Session Control)

For systems with complex session control logic, a state machine diagram is worth including. It illustrates the session orchestration logic (e.g., Idle → Connecting → Recording → Terminating, and so on), possibly for the PC orchestrator or the overall system, and clarifies how the system transitions between states on start/stop commands and error conditions. Making these transitions explicit emphasizes robustness in the design, which is crucial for a reliable multi-sensor system.
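
The same logic can be captured compactly in code, which also makes it unit-testable. The Kotlin sketch below is a hypothetical transition function (the state and event names are assumptions chosen to match the diagram, not the real implementation):

    // Hypothetical session states and allowed transitions for the orchestration logic.
    enum class SessionState { IDLE, CONNECTING, READY, RECORDING, TERMINATING, ERROR }

    fun next(state: SessionState, event: String): SessionState = when (state to event) {
        SessionState.IDLE to "connect"           -> SessionState.CONNECTING
        SessionState.CONNECTING to "connected"   -> SessionState.READY
        SessionState.READY to "START_SESSION"    -> SessionState.RECORDING
        SessionState.RECORDING to "STOP_SESSION" -> SessionState.TERMINATING
        SessionState.TERMINATING to "closed"     -> SessionState.IDLE
        else -> if (event == "fault") SessionState.ERROR else state   // unknown events leave the state unchanged
    }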

2. Tables: Performance, Configuration, and Coverage Summaries

Using tables is essential to present system configurations and evaluation results concisely. By incorporating several tables in Chapters 4–6, you can effectively summarize key parameters and outcomes. These tables will greatly enhance the clarity of the thesis.

2.1. System Configuration and Sensor Specification Table

A system configuration table summarizes the hardware and software setup of the system. Columns for each sensor (TC001 thermal, Shimmer3 GSR, RGB camera) list key parameters (e.g., 512×384 resolution for thermal, the GSR sampling rate in Hz, the RGB video frame rate, the communication interface, etc.), as well as the Android device model, OS version, and PC specs. This provides context for the performance results and assures the examiner that the system configuration is clearly documented.

2.2. Custom Protocol Command Table

Defining the custom TCP protocol messages used by the PC orchestrator and the Android app requires a dedicated protocol command table. Each row lists a command name (e.g., START_SESSION, STOP_SESSION, SYNC_TIME, HEARTBEAT), its purpose/description, and any parameters. This table typically appears in Chapter 4 to supplement the sequence diagram, ensuring the reader understands the communication interface and enabling reproducibility.
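
The table can be mirrored directly in code, which keeps the documentation and the implementation consistent. The Kotlin sketch below assumes a simple newline-delimited "NAME key=value" wire format; that format, and the Message/encode/decode names, are illustrative assumptions rather than the system's actual protocol:

    enum class Command { START_SESSION, STOP_SESSION, SYNC_TIME, HEARTBEAT }

    data class Message(val command: Command, val params: Map<String, String> = emptyMap())

    fun encode(msg: Message): String =
        (listOf(msg.command.name) + msg.params.map { "${it.key}=${it.value}" }).joinToString(" ")

    fun decode(line: String): Message {
        val parts = line.trim().split(" ")
        val params = parts.drop(1).associate { it.substringBefore("=") to it.substringAfter("=") }
        return Message(Command.valueOf(parts.first()), params)
    }

    // Example: encode(Message(Command.SYNC_TIME, mapOf("pc_time_ns" to "123456789")))
    // yields "SYNC_TIME pc_time_ns=123456789", and decode() reverses it.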

2.3. Test Plan Coverage Table

The test plan coverage table maps each test or evaluation scenario to specific system requirements or components, demonstrating thorough coverage. Rows could be test cases (unit, integration, and system tests), and columns indicate which subsystem or requirement each test verifies. Including this table in the testing chapter (Chapter 5) shows a systematic approach to validation and helps convince the examiner that all critical aspects have been verified, even without live experiments.

2.4. Performance Results Tables

Several performance results tables summarize quantitative results from the synthetic and log-based evaluations. These tables show:

  • Latency and Synchronization Accuracy Table: Measurements of the delay between PC command and actual recording start, as well as any clock offset/drift observed between sensors. For example, it might list initial time offset, offset after sync correction, and drift after X minutes for each sensor stream.
  • Data Throughput and Storage Table: How much data each sensor produces (e.g., thermal frames per second & frame size, GSR samples per second, etc.), the total data logged per minute, and any bottlenecks. This can also include the achieved logging duration (e.g., how long the system can record continuously) and file sizes, illustrating the system's capacity.
  • Resource Utilization Table: Summarizing CPU usage, memory usage, and possibly battery consumption on the Android device during operation. For instance, columns for CPU % (avg/max) for the app, memory footprint, and a note on battery drain per 10 minutes of recording. This proves the system's efficiency and viability on mobile hardware.

2.5. Test Results Summary Table

A test results summary table lists each major test scenario (unit, integration, and system test cases or synthetic experiments) alongside its expected outcome and actual outcome. For example: "Integration Test 1: PC-Android Connection Recovery -- Expected: reconnection within 2s after dropout; Actual: Passed, reconnected in 1.5s". This provides an easy-to-scan verification that all planned tests passed, notes any failures with explanations, and adds credibility to the evaluation by showing transparency in reporting results.

2.6. Coverage/Traceability Table

A coverage/traceability table traces project objectives or requirements to the sections of the report or tests where they are addressed. This could be included in Chapter 6 (Discussion/Conclusion) to explicitly demonstrate that all project aims were met and evaluated. While not strictly required, such a table can impress examiners by highlighting completeness (e.g., Objective: "Achieve time sync <50ms" -- Achieved: demonstrated in Section X, Test Y).

3. Tests: Unit, Integration, and System Tests

Chapter 5 should detail a rigorous testing strategy, even in the absence of live human experiments. Include descriptions of unit tests, integration tests, and system-level tests that were implemented. Testing is crucial for ensuring the reliability and accuracy of your Android multi-sensor system.

3.1. Unit Tests

Unit tests focus on individual modules of the system in isolation. For example:

  • Thermal Camera Interface Unit Test: Verify that the Android app's thermal module correctly captures frames at the expected rate and format using a stub or simulated TC001 camera input. Check that frame timestamps are recorded and that the frame data can be saved without error.
  • GSR Sensor Data Parser Unit Test: Ensure the parsing logic correctly interprets byte streams into GSR values with timestamps by simulating incoming BLE packets from the Shimmer3 GSR.
  • Time Sync Algorithm Unit Test: Verify the algorithm computes the correct adjustments for time alignment by testing it with synthetic timestamps and known offsets or drifts (a minimal JUnit sketch follows this list).
  • TCP Protocol Message Unit Tests: Prevent communication errors during integration by constructing and parsing sample protocol messages to verify that the PC and Android sides both encode/decode messages correctly.
  • Data Logging Unit Test: Test the logging component on Android by feeding it sample sensor data (in-memory) and verifying that it writes the correct format to storage.
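
As an illustration of the level of detail worth showing, the following JUnit 4 sketch tests the hypothetical estimateOffsetMs helper from Section 1.4 against synthetic timestamps with a known offset; it is a minimal example of the time-sync unit test described above, not code from the actual project:

    import org.junit.Assert.assertEquals
    import org.junit.Test

    class TimeSyncTest {
        @Test
        fun `known offset is recovered from synthetic timestamps`() {
            // Phone clock 40 ms ahead of the PC clock, symmetric 10 ms one-way network delay.
            val (offset, roundTrip) = estimateOffsetMs(t0 = 1_000, t1 = 1_050, t2 = 1_052, t3 = 1_022)
            assertEquals(40L, offset)
            assertEquals(20L, roundTrip)
        }
    }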

3.2. Integration Tests

Integration tests exercise multiple components working together and are usually run on the actual devices (or emulators) in a controlled manner:

  • PC-Android Communication Integration Test: Verify that the TCP connection is established and commands trigger actions on the phone by simulating a session initiation (a loopback sketch follows this list).
  • Multi-Sensor Synchrony Integration Test: Verify that the timestamps in the logs from each sensor align closely by running the system with all sensors enabled (thermal, GSR, RGB) for a short session using synthetic data inputs.
  • Data Storage and Retrieval Integration Test: Ensure no data loss in the pipeline from capture to storage by verifying that the data stored on the Android device is complete and correctly formatted after a recording.
  • Error Handling Integration Test: Test robustness by intentionally inducing or simulating common error conditions to verify the system handles these conditions gracefully.
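
A loopback version of the communication test can even run without the phone. The Kotlin sketch below stands up a fake "Android" endpoint in-process and checks the command/acknowledgement round trip; the ACK_START reply string is an assumption carried over from the earlier sequence sketch:

    import java.net.ServerSocket
    import java.net.Socket
    import kotlin.concurrent.thread

    fun main() {
        val server = ServerSocket(0)                        // fake Android endpoint on a free port
        thread {
            server.accept().use { client ->
                val reader = client.getInputStream().bufferedReader()
                val writer = client.getOutputStream().bufferedWriter()
                if (reader.readLine() == "START_SESSION") {
                    writer.appendLine("ACK_START"); writer.flush()
                }
            }
        }

        Socket("127.0.0.1", server.localPort).use { pc ->   // "PC orchestrator" side of the test
            val writer = pc.getOutputStream().bufferedWriter()
            val reader = pc.getInputStream().bufferedReader()
            writer.appendLine("START_SESSION"); writer.flush()
            check(reader.readLine() == "ACK_START") { "start command was not acknowledged" }
            println("PC-Android connection test passed")
        }
        server.close()
    }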

3.3. System Tests

System tests are high-level tests of the entire system's behavior in realistic conditions (using synthetic data or a controlled environment since no real biomedical experiment is done):

  • Full Session Simulation Test: Evaluate overall system stability by conducting a complete end-to-end run where the PC orchestrator initiates a session, all sensors record for a specified duration, then stop.
  • Stress/Load Test: Monitor for any performance degradation, memory leaks, or time drift over long durations by pushing the system to its limits using synthetic high-rate data or longer duration recording.
  • Cross-Platform Connectivity Test: Ensure the TCP protocol handles minor packet losses or delays by testing the network protocol across different network conditions.
  • User Interface & Control Test: Verify that the user controls correctly reflect the system state if the PC orchestrator or Android app has a GUI.

4. Validation Methods: Correctness, Robustness, Sync, and Completeness

Chapter 5 should explain how the system is validated using controlled or synthetic methods to prove it meets its design goals. Key validation procedures include:

4.1. Functional Correctness Validation

Verify the system using known inputs and expected outputs. For example, feed a synthetic GSR signal with a predictable pattern through the Shimmer device or a simulator, and check that the logged data matches the pattern exactly. Similarly, use a dummy thermal image sequence to ensure the system records frames in the right order and none are corrupted.
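
A small checker makes this concrete. The Kotlin sketch below regenerates the known synthetic pattern and compares it sample-by-sample with what was logged; the 0.1 Hz sine, the 10 Hz sampling rate, and the verifySyntheticGsr name are illustrative assumptions:

    import kotlin.math.abs
    import kotlin.math.sin

    // Returns true if the logged values reproduce the injected 0.1 Hz test sine within tolerance.
    fun verifySyntheticGsr(logged: List<Double>, sampleRateHz: Double = 10.0, tolerance: Double = 1e-3): Boolean {
        val expected = List(logged.size) { i -> sin(2 * Math.PI * 0.1 * i / sampleRateHz) }
        return logged.indices.all { abs(logged[it] - expected[it]) < tolerance }
    }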

4.2. Time Synchronization Validation

To validate sync accuracy without external reference equipment, generate a synchronization event observable by all sensors and measure timing. For instance, programmatically insert a timestamped marker or trigger in both data streams, and then compare the timestamps of these markers in the thermal versus GSR log to compute the difference.
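
The arithmetic of the check is simple, as the hedged sketch below shows: once the same injected marker has been located in both logs (with timestamps already mapped to the reference clock), the residual synchronization error is just the difference between the two timestamps.

    import kotlin.math.abs

    // Residual sync error between two streams, given the timestamp of the same marker event in each log.
    fun syncErrorMs(thermalMarkerNs: Long, gsrMarkerNs: Long): Double =
        abs(thermalMarkerNs - gsrMarkerNs) / 1_000_000.0

    // e.g. syncErrorMs(12_000_003_500_000, 12_000_000_000_000) == 3.5  (markers 3.5 ms apart)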

4.3. Robustness and Fault Tolerance Validation

Deliberately test how the system handles faults. For example, simulate a sensor disconnect and verify the system continues running and logs an error rather than crashing. Or test network interruption by disconnecting the WiFi or killing the TCP connection briefly—the system should either pause and resume or safely stop with an error message.

4.4. Completeness Verification

Ensure that all intended data is indeed captured and none is lost or dropped silently. This can be validated by simple counting and cross-checking: if the GSR sensor is known to send 10 samples per second for 60 seconds, the log should have ~600 entries; if the thermal camera runs at 8 Hz for 60 seconds, about 480 frames should be saved.
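
This counting rule is easy to encode as a reusable check. The sketch below is a minimal version, assuming a small tolerance for start-up and shutdown edges (the 2% figure and the isComplete name are illustrative choices, not project values):

    // Expected count = rate x duration; allow a small shortfall for session start/stop edges.
    fun isComplete(loggedCount: Int, rateHz: Double, durationS: Double, tolerance: Double = 0.02): Boolean =
        loggedCount >= rateHz * durationS * (1 - tolerance)

    // e.g. GSR at 10 Hz for 60 s: isComplete(598, 10.0, 60.0) is true (598 >= 588 of ~600 expected);
    // thermal at 8 Hz for 60 s should yield about 480 frames.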

4.5. Performance Validation

Validate that performance metrics meet expectations. For example, check that CPU usage stays below a certain threshold by using Android's profiling tools or logging system stats during a test run. Validate that the logging rate can keep up with incoming data by monitoring internal queues or observing that there are no noticeable delays in timestamps.
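
Android's standard APIs are enough for a coarse in-app probe, as in the hedged sketch below: logging these figures periodically (say every 10 s) during a test run, and looking at the deltas between snapshots, gives a rough CPU and heap profile without external profiling tools. The function name and log tag are invented for this example.

    import android.os.Process
    import android.os.SystemClock
    import android.util.Log

    fun logResourceSnapshot(tag: String = "PerfProbe") {
        val cpuMs = Process.getElapsedCpuTime()              // CPU time consumed by this process so far
        val uptimeMs = SystemClock.elapsedRealtime()         // wall-clock time since boot
        val rt = Runtime.getRuntime()
        val heapUsedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024)
        Log.i(tag, "cpu=${cpuMs}ms uptime=${uptimeMs}ms heapUsed=${heapUsedMb}MB")
    }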

4.6. Absence of Physiological Data Impact

Since synthetic data is used, explicitly mention that validation focuses on system behavior rather than signal interpretation. For completeness, one could argue this is actually a strength: the system's correctness can be validated independently of human variability. Emphasize that each sensor's data is treated generically, so the validation with test patterns is sufficient to prove the system would handle real data just as well.

5. Results and Outputs: Presenting Key Result Types

Chapter 5 (and Chapter 6 for discussion) should present clear results backed by the above tests and validations. Include all necessary result types to convince the reader of the system's performance and reliability.

5.1. Example Log Snippets

Include small snapshots of the actual recorded data logs as figures or in-text monospace blocks to show what the data looks like. For instance, a few lines from the GSR data file with timestamps and values, alongside a thermal frame timestamp log or file name, illustrate how the data is timestamped and how it is formatted (e.g., CSV columns).

5.2. Synchronization/Drift Results

Report the measured time drift or sync error between sensors. This could be a numeric result (e.g., "Clock drift remained under 5 ms after 10 minutes of recording") or a small plot. If possible, include a graph of timestamp offset over time for one sensor relative to another or relative to the PC clock.

5.3. Performance Metrics Results

Present the data from performance tests, likely via tables or graphs. Key results would be:

  • Latency: E.g., the delay between sending a start command and actual data onset. If measured, report average and max latency.
  • Throughput: E.g., "Thermal camera delivered ~8 FPS consistently; GSR sensor ~32 samples/sec; no samples/frames were dropped at the logging end."
  • CPU/Memory/Battery: Report, for example, "CPU usage on the Android device averaged 45% (single core) during recording, with no thermal throttling observed" or "Memory usage remained stable at 100 MB, indicating no memory leak."

5.4. Validation Test Outcomes

Summarize the outcomes of the correctness and robustness checks. For example, state the result of the synthetic input tests: "When a sine-wave test signal (0.1 Hz) was input to the GSR module, the same waveform was observed in the logged data (Pearson correlation ~1.0 between input and output), confirming end-to-end accuracy."
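
If such a correlation figure is reported, it is worth showing how it was computed. The Kotlin sketch below is a straightforward Pearson correlation between the injected test signal and the values recovered from the log (an r close to 1.0 indicates the pipeline preserved the waveform); it is offered as an illustration of the check, not as the project's actual analysis code.

    import kotlin.math.sqrt

    fun pearson(x: List<Double>, y: List<Double>): Double {
        require(x.size == y.size && x.size > 1) { "series must be the same, non-trivial length" }
        val mx = x.average()
        val my = y.average()
        val cov = x.indices.sumOf { (x[it] - mx) * (y[it] - my) }
        val sx = sqrt(x.sumOf { (it - mx) * (it - mx) })
        val sy = sqrt(y.sumOf { (it - my) * (it - my) })
        return cov / (sx * sy)
    }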

5.5. Screenshots/Images

It might be helpful to include a screenshot of the Android app or the PC orchestrator interface during a session to give the reader a visual sense of the running system. Alternatively, an example thermal image frame captured by the system can be presented (with an appropriate color palette) to show the actual data being handled. Be sure to caption it properly (e.g., "Example thermal image frame captured by the Topdon TC001 via the Android app").

5.6. Overall System Demonstration Results

Summarize the system's achievements with a combination of results in the discussion or conclusion (Chapter 6). For instance, include a table or concise summary stating that the system met all requirements: e.g., "Multi-sensor recording achieved; Synchronized to within 5ms; No data loss over 15 min; CPU usage under 50%; All 20 planned tests passed."

5.7. No Biomedical Conclusions Drawn

Emphasize in results that, since no real physiological experiment was performed, the results focus purely on system performance rather than any findings about human data. For instance, instead of discussing GSR trends or thermal patterns, the results stick to technical metrics (timing, accuracy, performance).

5.8. Discussion of Results

Alongside presenting raw results, Chapter 6 should include an interpretation: why the sync error is small (referring to the method used), what the resource usage implies (feasibility on a smartphone), and any limitations observed (for example, if the thermal camera frame rate was lower than expected, or if drift was noticed over very long sessions). Being upfront about these in the results discussion shows critical evaluation, which is essential for a top grade. Also mention how the synthetic tests give confidence that the system would handle real signals similarly, and suggest that future work could involve actual experiments now that the system is validated.

By including all the above diagrams, tables, tests, validation methods, and result types in Chapters 4–6, the thesis will comprehensively document the system's design and thoroughly evaluate its performance. This approach meets the standard expectations for an MSc project and demonstrates a high level of technical competence and critical analysis, positioning the work for a strong (80%+) evaluation.

In conclusion, this detailed approach to documenting and evaluating your Android multi-sensor system ensures a thorough and high-quality thesis. Good luck!