Building a “Hello World” for self-driving labs: The Closed-loop Spectroscopy Lab Light-mixing demo

SUMMARY
Learn how to build a Closed-loop Spectroscopy Lab: Light-mixing demo (CLSLab:Light) to perform color matching via RGB LEDs and a light sensor for under 100 USD and in less than an hour of setup. Our tutorial covers ordering parts, verifying prerequisites, software setup, sensor mounting, testing, and an optimization-algorithm comparison. We use secure IoT-style communication via MQTT, MicroPython firmware on a pre-soldered Pico W microcontroller, and the self-driving-lab-demo Python package. A video tutorial is available at https://youtu.be/D54yfxRSY6s. For complete details on the use and execution of this protocol, please refer to Baird et al.1



BEFORE YOU BEGIN
The protocol below describes how to set up the Closed-loop Spectroscopy Lab: Light-mixing Demo (CLSLab:Light), a "Hello, World!" for a "self-driving" (i.e., autonomous) laboratory (SDL)2 using a Pico W microcontroller, LEDs, a light sensor, and Bayesian optimization. CLSLab:Light incorporates key principles for SDLs, including sending commands, receiving sensor data, physics-based simulation, and advanced optimization. This "Hello, World!" introduction is accessible to students, educators, hobbyists, and researchers, costing less than 100 USD, occupying a small footprint, and requiring under an hour of setup time. For a full video build tutorial, please refer to https://youtu.be/D54yfxRSY6s. There are some deviations between the instructions in the YouTube video build tutorial and recent versions of the self-driving-lab-demo Python package; in particular, see steps 13 and 14.

Order required parts
Timing: 5 min (not including shipping time)
1. Order the parts: (https://www.digikey.com/short/qztj2jt7 AND a Pico W with pre-soldered headers) OR https://www.digikey.com/short/vtzjbvr2. A visual summary of the parts is given in Figure 1.
Note: For the first option, the total is 68.61 USD (or 73.72 USD including optional parts) plus shipping as of 2022-03-06.
Note: The authors plan to periodically check and update the "DigiKey Order" link at https://hackaday.io/project/186289-autonomous-research-laboratories in case of part shortages or deprecation.
Note: The sculpting wire needs to be 14 gauge (2 mm) or thinner, including the insulation jacket, and rigid enough to support the sensor. The sculpting wire is only used for mounting purposes, not to conduct electricity. Sculpting wire is also available at Amazon. Approximately 3 ft is required. See problem 5.
Note: The purpose of the wall adapter is so that, after initial setup, the demo can be powered standalone where communication happens purely via Wi-Fi.
Note: The hardware and software were designed to work with the Pico W, though the setup can be adapted for other microcontrollers. See problem 1.
Note: The bill of materials, not including the sculpting wire, is also available at Adafruit.

Additional prerequisites
Timing: N/A
Note: The purpose of using a wireless connection rather than a hardwired one is to capture the principles behind "cloud experimentation," where the host and the client may be separated by large geographical distances. Additionally, this means a computer is only required for initial setup; afterwards, the device can function standalone, waiting to receive commands and send sensor data. This captures best practices of a scaled-up cloud-accessible lab or network of labs. For more context, see https://github.com/sparks-baird/self-driving-lab-demo/discussions/91 and https://github.com/sparks-baird/self-driving-lab-demo/discussions/62. For links to a simple example using a wired connection and related discussion, see problem 2.
CRITICAL: If you use a mobile hotspot, you may need to use your device's "extended compatibility" feature to drop the mobile hotspot from 5 GHz to 2.4 GHz. See also the prepaid, long-expiry hotspot and classroom demos with standalone network access discussions, which include a summary of recommendations for prepaid mobile hotspots.
2. Ensure access to a computer (for initial setup only).
Note: At a minimum, the computer needs to be able to run the Thonny editor (lightweight) and it must have at least one USB-A port.
3. If the headers are not already soldered onto the microcontroller, ensure access to a soldering iron and solder (thinner is better in this case).
Note: If soldering, be careful to only heat the gold pads to avoid damaging the circuitry.
Optional: Ensure the Pico W can successfully connect to a computer by holding the BOOTSEL button on the Pico W while connecting it to your computer via the USB cable. If a new drive appears, the Pico W is working normally.

Hardware setup
Timing: 20 min
Unless pre-soldered, attach the headers onto the Pico W, mount the light sensor so that the pinhole faces the red-green-blue (RGB) LED, connect the light sensor to the board, and get the microcontroller ready for firmware installation.
Note: This setup allows the position and orientation of the sensor to be both adjustable and steady.
4. Continue twisting until you have 4-6 inches of twisted wire, and ensure that there are at least 3 inches of loose, untwisted wire at each end. See Figure 2 and Methods video S1.
Note: The leftover, untwisted wire will be threaded through the mounting holes of the light sensor in the next step. For a more modular alternative for fixing the wire ends to the Maker Pi Pico base, see problem 4.
5. Thread the same sculpting wire through the AS7341 light sensor's mounting holes and position the sensor so that the pinhole faces the RGB LED from approximately 3-4 inches away. See Figure 4 and Methods video S2.
6. Connect the Grove/Stemma-QT connector into Grove port 6 (GP26 & GP27) and the AS7341, insert the SD card (optional), insert the Pico W, and, while holding the BOOTSEL button, connect the Pico W to the computer. See Figure 5 and Methods video S3.
7. Install the Thonny editor.
b. When installing, use the default settings: "Standard (default)". Thonny comes with its own version of Python, located by default at C:\Users\<username>\AppData\Local\Programs\Thonny\python.exe on Windows computers.
c. This is not anticipated to cause conflicts with existing installations of Python; however, for conda users, an isolated installation may be performed via the following commands in a conda shell:
8. Click on the lower-right dropdown and click "Install MicroPython", which will install the microcontroller firmware onto the Pico W. See Figure 6 and Methods video S4.
9. Choose "MicroPython variant: Raspberry Pi · Pico W / Pico WH" and click install. See Figure 7 and Methods video S4.
10. Change the interpreter from Local Python 3 to MicroPython (Raspberry Pi Pico), which opens a shell that can be used to enter MicroPython commands that run directly on the Pico W. See Figure 8 and Methods video S4.
11. In Thonny's menu bar, click "View", then "Files" to open a sidebar that shows both your local computer's files (top) and the files on the Pico W (bottom). See Figure 9 and Methods video S5.
12. Download sdl_demo.zip from the latest release at self-driving-lab-demo to your computer and unzip it. See Methods video S5.
13. In Thonny, navigate to the unzipped sdl_demo folder, open secrets.py, enter your Wi-Fi network name (SSID) and password as Python strings, and save secrets.py. See Figures 10 and 11 and Methods video S5.
Optional: you can create your own MongoDB Atlas database and enter values for MONGODB_API_KEY, MONGODB_COLLECTION_NAME, and DEVICE_NICKNAME into secrets.py (see below).
Optional: you can create your own HiveMQ instance and enter secrets.py credentials for HIVEMQ_USERNAME, HIVEMQ_PASSWORD, and HIVEMQ_HOST (see below).
a. Set up a MongoDB database backend.
Note: If ignored, the demo will function, just without logging data to a database (i.e., the user becomes responsible for saving the data on the client side). See problem 3.
i. Create an account at https://www.mongodb.com/cloud/atlas/register.
ii. Create a free, shared cluster. See Figure 12.
…com/app/<data-abc123>/endpoint/data/v1" where <data-abc123> is the app name. See Figure 15.
ix. Copy the app name into the MONGODB_APP_NAME variable in secrets.py.
x. Click "Create API Key", enter a name of your choice (e.g., clslab-light), and click "Generate API key". See Figure 16.
xi. Copy the API key and store it somewhere secure, then paste the API key into the MONGODB_API_KEY variable in secrets.py.
b. Create your own HiveMQ instance.
Note: If this setup is ignored, the demo will function properly; however, the hardware commands and sensor data will be transmitted via a default HiveMQ instance for which the credentials are public. Setting up your own HiveMQ instance ensures that the data you transfer remains private and secure. Other MQTT brokers such as Mosquitto or Adafruit IO are available. At the time of writing, we recommend HiveMQ because it provides free instances with generous limits. Setting up a private MQTT broker is in line with best practices for internet of things (IoT) security and should be used especially when working with sensitive data.
v. Enter the server address (i.e., HIVEMQ_HOST) into secrets.py and run the Google Colab cells.
vi. Follow the instructions to download the hivemq-com-chain.der file to the unzipped sdl_demo folder.
14. Upload files to the Pico W microcontroller. See Figure 19 and Methods video S6.
a. While holding Ctrl (Windows) or Cmd (Mac), select "lib", "main.py", "hivemq-com-chain.der", and "secrets.py".
Note: hivemq-com-chain.der is not mentioned in the YouTube tutorial, as it had not been implemented at the time the video was created.
b. Right-click in the gray region.
c. Click "Upload to /".
15. Double-click to open main.py, click the green play button (i.e., run the code on the Pico W), and note the PICO ID that prints to the command window ("prefix/picow/<PICO_ID>/").
Note: This will act as the "password" to control the demo.
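For reference, a filled-in secrets.py might look like the sketch below. The Wi-Fi variable names and all values are hypothetical placeholders; only the MongoDB and HiveMQ variable names are the ones referenced in steps 13a and 13b. Never commit real credentials to version control.

```python
# secrets.py -- hypothetical example; never commit real credentials.
# SSID/PASSWORD names and all values are placeholders; the MongoDB and
# HiveMQ variable names are those referenced in steps 13a and 13b.

SSID = "my-network-name"        # 2.4 GHz Wi-Fi network name
PASSWORD = "my-wifi-password"   # Wi-Fi password

# Optional: private HiveMQ MQTT broker (step 13b)
HIVEMQ_USERNAME = "clslab-light-user"
HIVEMQ_PASSWORD = "a-strong-password"
HIVEMQ_HOST = "abc123.s1.eu.hivemq.cloud"

# Optional: MongoDB Atlas Data API logging (step 13a)
MONGODB_APP_NAME = "data-abc123"
MONGODB_API_KEY = "paste-your-api-key-here"
MONGODB_COLLECTION_NAME = "clslab-light"
DEVICE_NICKNAME = "my-clslab-light"
```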

Control from the cloud
Timing: 10 min
18. Copy the PICO ID from the Thonny editor and paste it in place of "test" (without quotes). See Figure 22. An example image of the output is given in Figure 23. See also Methods video S8.
Note: the actual output to the command window may vary in future releases.
Note: If you leave PICO_ID set to "test", you will control a public demo maintained by the authors for testing and demonstration purposes. The authors will strive to keep this public test demo available for the foreseeable future with minimal downtime.
19. Run the remaining code cells. See Methods videos S8, S9, and S10.
a. Instantiate a SelfDrivingLabDemo class.
b. Perform optimizations for grid search, random search, and Bayesian optimization.
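To make the messaging model concrete, the sketch below assembles the MQTT topic from step 15's "prefix/picow/<PICO_ID>/" layout and a hypothetical JSON command payload. The payload key names, the 0-255 range, and any trailing topic segments used by the actual self-driving-lab-demo firmware are assumptions for illustration.

```python
import json

def command_topic(pico_id: str) -> str:
    # Topic layout from step 15: "prefix/picow/<PICO_ID>/". The real
    # firmware may append further segments (e.g., per-pin channels).
    return f"prefix/picow/{pico_id}/"

def command_payload(r: int, g: int, b: int) -> str:
    # Hypothetical payload: the key names and the 0-255 range are
    # assumptions, not the exact schema used by self-driving-lab-demo.
    for v in (r, g, b):
        if not 0 <= v <= 255:
            raise ValueError("RGB values must be in [0, 255]")
    return json.dumps({"R": r, "G": g, "B": b})

# An MQTT client (e.g., paho-mqtt on the host, umqtt on the Pico W) would
# publish command_payload(...) to command_topic(...) and subscribe to a
# companion topic to receive the measured spectrum back.
print(command_topic("test"))        # prefix/picow/test/
print(command_payload(10, 20, 30))
```

Keeping the PICO ID inside the topic string is what makes it act as the "password" described in step 15: only clients that know the ID publish to (or hear from) that device.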

EXPECTED OUTCOMES
It is expected that users will successfully set up the hardware and software for a closed-loop experiment. Further, users will run their first "autonomous drive" given in an example interactive notebook and explore additional example notebooks. Figure 24 shows a comparison of optimization results for grid search vs. random search vs. Bayesian optimization averaged over repeat campaigns with standard-deviation error bands, where Bayesian optimization, on average, performs best. Figure 25 shows one of the outputs from the cloud-based control notebook: best error so far vs. iteration number for grid search vs. random search vs. Bayesian optimization. Typically, grid search is the least efficient, Bayesian optimization is the most efficient, and random search is somewhere in between. Figures 26, 27, and 28 show the points that were searched in a given campaign for grid search, random search, and Bayesian optimization, respectively. Finally, Figure 29 shows the true, underlying target color (defined by red, green, and blue values) and the best parameter set, based on minimizing error between the observed spectrum and the target spectrum, for each of the optimization methods.
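The qualitative ranking above can be explored on a toy objective. The sketch below compares grid and random search at equal evaluation budgets on a stand-in squared-error color-matching objective; the target and budgets are arbitrary choices for illustration, and Bayesian optimization (which the demo's notebooks perform via a dedicated library) is omitted to keep the example dependency-free.

```python
import itertools
import random

# Toy stand-in for the color-matching objective: squared error between a
# candidate RGB triple and a hidden target. Illustrative only -- the real
# demo minimizes a Fréchet distance between measured and target spectra.
TARGET = (42, 180, 96)

def error(rgb):
    return sum((a - b) ** 2 for a, b in zip(rgb, TARGET))

def grid_search(n_per_axis=4):
    # 4 levels per channel -> 4**3 = 64 evaluations
    levels = [int(i * 255 / (n_per_axis - 1)) for i in range(n_per_axis)]
    return min(itertools.product(levels, repeat=3), key=error)

def random_search(n_trials=64, seed=0):
    # Same evaluation budget as the grid, sampled uniformly at random
    rng = random.Random(seed)
    candidates = [tuple(rng.randint(0, 255) for _ in range(3))
                  for _ in range(n_trials)]
    return min(candidates, key=error)

best_grid = grid_search()
best_rand = random_search()
print("grid:  ", best_grid, error(best_grid))
print("random:", best_rand, error(best_rand))
```

At a fixed budget, the grid's resolution per axis shrinks rapidly with dimensionality, which is one intuition for why grid search tends to fall behind on this kind of problem.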

QUANTIFICATION AND STATISTICAL ANALYSIS
Discrete Fréchet distance, as implemented in https://github.com/cjekel/similarity_measures, is used to assess the mismatch between the currently observed spectrum and the target spectrum, where the target spectrum is determined by arbitrarily choosing a random set of RGB values and measuring the sensor data for that fixed, random set of RGB values. Lower Fréchet distances correspond to better matches between the observed and target spectra (i.e., lower error).
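For reference, the discrete Fréchet distance can be computed with the classic Eiter-Mannila dynamic program. This is an illustrative stand-in, not the similarity_measures implementation the protocol actually uses.

```python
from math import dist  # Python 3.8+: Euclidean distance between points

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between polylines P and Q.

    Classic dynamic-programming formulation (Eiter & Mannila, 1994):
    ca[i][j] is the coupling distance for prefixes P[:i+1], Q[:j+1].
    """
    n, m = len(P), len(Q)
    ca = [[0.0] * m for _ in range(n)]
    ca[0][0] = dist(P[0], Q[0])
    for i in range(1, n):
        ca[i][0] = max(ca[i - 1][0], dist(P[i], Q[0]))
    for j in range(1, m):
        ca[0][j] = max(ca[0][j - 1], dist(P[0], Q[j]))
    for i in range(1, n):
        for j in range(1, m):
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]),
                           dist(P[i], Q[j]))
    return ca[n - 1][m - 1]

# Two parallel unit segments separated vertically by 1 -> distance 1.0
print(discrete_frechet([(0, 0), (1, 0)], [(0, 1), (1, 1)]))  # 1.0
```

Treating each spectrum as a polyline of (wavelength, intensity) points makes this directly applicable to the observed-vs-target comparison described above.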
An example JSON document logged to a MongoDB database backend containing experimental data for a single run is given as follows: The experimental parameters for two JSON documents are given in Table 1.
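The actual logged document is not reproduced in this text. As a purely hypothetical sketch, a run document might contain fields like the following, assembled from parameters named elsewhere in this protocol (R/G/B LED parameters; atime, gain, and astep sensor settings; "ch###" channel readings; the DEVICE_NICKNAME from secrets.py); the real schema used by self-driving-lab-demo may differ.

```python
import json

# Purely hypothetical sketch of a logged run document. Field names are
# assumptions assembled from parameters named in this protocol, not the
# actual schema used by self-driving-lab-demo.
run_doc = {
    "device_nickname": "my-clslab-light",     # from secrets.py (step 13a)
    "R": 12, "G": 35, "B": 8,                 # commanded LED parameters
    "atime": 100, "astep": 999, "gain": 128,  # sensor settings
    "ch410": 1234, "ch440": 2345, "ch470": 3456,  # channel list truncated
}
print(json.dumps(run_doc, indent=2))
```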

LIMITATIONS
Environmental noise (e.g., light conditions) and hardware variation (LED, sensor, sensor positioning, etc.) may affect the results obtained.

TROUBLESHOOTING
See the GitHub issue tracker for existing known issues or to post a new issue. See the GitHub discussions for general questions and discussion.

Problem 1
Can I use this with alternate microcontrollers or firmware?

Potential solution
The hardware configuration and software were designed around Raspberry Pi's Pico Wireless (Pico W) microcontroller. Libraries exist for LED control and the AS7341 light sensor in CircuitPython and Arduino, so the setup can be adapted for other microcontrollers. Contributions at https://github.com/sparks-baird/self-driving-lab-demo/ are welcome. See order required parts.

Problem 2
Can I use this without connecting to the internet?

Problem 3
Can I use this without logging to a MongoDB backend?

Potential solution
If the MongoDB credentials are left at their default dummy values in secrets.py, then logging to the MongoDB backend will fail and the device will simply notify the user rather than exit the program. In other words, the device will function normally without database logging. The same applies to logging to an onboard SD card: if an SD card is detected, the microcontroller will write backup data to it; otherwise, this step is skipped. See step 13.a.
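The graceful-degradation behavior described above can be sketched as follows; the function names are hypothetical, and this is an illustrative pattern, not the firmware's actual code.

```python
def log_run(payload, mongodb_logger=None, sd_logger=None):
    """Attempt each optional logging backend without crashing the run.

    Illustrative pattern only -- names are hypothetical and do not
    reflect the actual self-driving-lab-demo firmware.
    """
    warnings = []
    for name, logger in (("MongoDB", mongodb_logger), ("SD card", sd_logger)):
        if logger is None:
            continue  # backend not configured; silently skip
        try:
            logger(payload)
        except Exception as exc:
            # Notify the user instead of exiting the program
            warnings.append(f"{name} logging failed: {exc}")
    return warnings

def broken_backend(payload):
    raise ConnectionError("bad credentials")

print(log_run({"R": 1}, mongodb_logger=broken_backend))
# ['MongoDB logging failed: bad credentials']
```

Either way, the caller remains responsible for saving the measured data on the client side when no backend succeeds.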

Problem 4
The Stemma-QT to Grove connectors (or other items) are out-of-stock.

Potential solution
First, look at Adafruit and other vendors to see if the part is available. Note that Cat#1528-4424-ND is incompatible with the Maker Pi Pico base because the adapter housing blocks it from being plugged in fully. If no Stemma-QT to Grove connectors can be located, an alternative is to use a Stemma-QT to header-pin cable (DigiKey Cat#1528-4209-ND) and plug directly into the GPIO pins that correspond to Grove port 6 of the Maker Pi Pico base. For other items that may be out of stock at DigiKey or Adafruit, other vendors may be used (e.g., the AS7341 light sensor from Electromaker). See order required parts.

Problem 5
The sculpting wire doesn't fit through the mounting holes.

Potential solution
Ensure that the outer diameter of the sculpting wire is 14 AWG or higher (i.e., 1.628 mm or thinner). Enameled wire (often advertised as sculpting wire) has a very thin coating, whereas electrical wiring typically has a non-negligible insulation thickness. Optionally, for a more modular setup, a single M2.5 binding post (DigiKey Cat#36-8737-ND) can be used to clamp the wire via a single mounting hole instead of looping the wire through each of the mounting holes. See order required parts and step 3.
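As a quick check when sourcing wire, bare-conductor diameter follows the standard AWG formula d(mm) = 0.127 × 92^((36−n)/39) for gauge n; note this excludes any insulation jacket, which is why enameled wire fits more easily than insulated electrical wire.

```python
def awg_to_mm(n: int) -> float:
    # Standard AWG definition: bare-conductor diameter in mm for gauge n
    return 0.127 * 92 ** ((36 - n) / 39)

print(round(awg_to_mm(14), 3))  # 1.628 -- higher gauge numbers are thinner
```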

Problem 6
My SD card isn't being recognized.

Potential solution
First, note that use of the micro SD card is optional; it serves the purpose of onboard backup data logging. The 128 MB micro SD card recommended in this work (DigiKey Cat#1528-5250-ND) has been tested with the rest of the components. Try removing the micro SD card completely and reinserting it, making sure there is an audible "click". If the microcontroller still fails to detect the micro SD card, there may be a defect in the micro SD card or the Maker Pi Pico base. Try ordering an extra micro SD card (the same one recommended above); if the new card works, you should be able to request a refund for the first one. If it still does not work, contact the seller of the Maker Pi Pico base to request a replacement. If not using the recommended SD card, the card formatting may be incompatible with MicroPython (see https://github.com/CytronTechnologies/MAKER-PI-PICO/issues/4); in this case, you will likely need to purchase a different type of SD card. See order required parts and step 3.

RESOURCE AVAILABILITY
Lead contact
Further information and requests for resources and reagents should be directed to and will be fulfilled by the lead contact, Taylor D. Sparks (sparks@eng.utah.edu).

Materials availability
This study did not generate new unique reagents.

Data and code availability
The code generated during this study is available on GitHub: https://github.com/sparks-baird/self-driving-lab-demo. The recommended option for ordering parts is https://www.digikey.com/

Figure 29. The true, underlying RGB target (purple diamond) and the best observed points for grid search (blue circle), random search (red circle), and Bayesian optimization (green circle). Bayesian optimization gave the closest match to the true target. The LED parameters are red (R), green (G), and blue (B). The sensor settings are atime, gain, and astep (which affect integration time and intensity). The measured output values are of the form "ch###", where the three-digit number corresponds to the full-width half-max (FWHM) wavelength being measured.