
Abstract: In this paper, an illumination measurement system is proposed and experimentally demonstrated. The system consists of two parts: an illumination acquisition module mounted on a UAV, and a real-time display interface with control functions hosted on a cloud platform. The illuminance acquisition module consists of a light sensor, a distance sensor, a wireless communication module, and a power-supply component. The OneNET cloud platform developed by China Mobile is chosen as the display interface for the system's real-time data. In addition, the local-outlier factor (LOF) algorithm is used for outlier rejection to further improve the stability and accuracy of the system. The illuminance acquisition system designed and implemented was then applied to illuminance measurement experiments in a stadium, where the acquired data were used to investigate the illuminance distribution at different height levels. The root mean square error (RMSE) of the acquired data with respect to the standard illuminance values was 2.45, with an error range of −2.77% to −0.53% for dynamic measurements.


Introduction
With the development of the economy and society, the number of sports grounds has risen rapidly, and sports venues that can host major events are being built continuously. When an international event is hosted, illumination requirements usually cover not only the sports areas but also the audience areas. According to the General Lighting Standards for the Olympic Games, the average vertical illuminance toward the cameras should be no less than 25% of that of the playing area, and from row 12 of the audience area onwards the illuminance should decrease evenly to less than 10% of that of the playing area [1]. In general, comprehensive stadiums are divided into six levels according to the Sports Stadium Lighting Design and Inspection Standards (JGJ 153-2016) [2], and venue lighting levels are generally above level IV because venues have to adapt to a variety of competition events [3]. In class IV stadiums, the ratio of horizontal minimum to maximum illuminance should be 0.3, and the ratio of horizontal minimum to average illuminance should be 0.7. Meanwhile, when broadcasting is required, the vertical illuminance toward the primary and secondary cameras must also be taken into account: the vertical illuminance in the direction of the main camera should be 1000 lx, while that in the direction of the secondary camera should be 750 lx [4]. Under these requirements, the measuring range may be very large. Moreover, vertical illuminance can be difficult to measure, and in swimming competition venues the measurement process may be further complicated by venue restrictions.
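The two horizontal-uniformity ratios named above can be checked directly from a grid of illuminance readings. The sketch below is a plain illustration; the grid values are invented for demonstration, not taken from the standard.

```python
def uniformity_ratios(illuminances):
    """Return (min/max, min/avg) for a list of horizontal illuminance readings in lx."""
    e_min, e_max = min(illuminances), max(illuminances)
    e_avg = sum(illuminances) / len(illuminances)
    return e_min / e_max, e_min / e_avg

# Invented grid readings (lx) for a class IV check:
grid = [820, 900, 1010, 950, 880, 1000]
u1, u2 = uniformity_ratios(grid)   # min/max and min/avg
```

A venue passes the class IV criteria above when `u1 >= 0.3` and `u2 >= 0.7`.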
Faced with large-scale measurements and complex scenarios, traditional measurement methods are time-consuming and face major challenges. With the maturity of UAV technology and the continuous development of UAV application research, taking advantage of the convenience of UAV flight is an excellent solution to these problems. Initially, UAVs were invented for military use. In recent years, the availability of low-cost sensors and platforms, together with increasingly mature micro-electromechanical systems (MEMS) and microsensor technologies, has laid the foundation for the civilian use of UAVs [5]. A growing number of research and commercial applications are emerging thanks to the convenience of UAVs [6,7], and continuing studies keep uncovering more potential for drone application technology [8,9]. Colomina and Molina [10] applied UAVs to photogrammetry and remote sensing by extracting information from images and generating 3D data; Wefelscheid et al. [11] used UAVs for 3D modeling, reconstructing the 3D shape of a building or area; Boccardo et al. [12] studied how UAVs can be used in disaster detection to map the situation and provide up-to-date information after a catastrophic event or before an anticipated one. Researchers have also investigated quicker and more convenient ways to measure illumination using UAVs. Maemura et al. [13] used a PHANTOM UAV carrying an illumination sensor to measure horizontal and vertical illuminance in a gymnasium, demonstrating the feasibility of UAV-based gymnasium illumination measurement; they also discussed the three-dimensional illumination distribution and the flow of light inside the gymnasium space. In fact, UAVs are already being used in many areas of measurement.
Sensor-mounted UAVs have been used in a variety of measurement applications [14,15]. However, illuminance, an important light-environment parameter, has been relatively little studied in UAV-based measurement, and most UAV applications rely on the functionality of the drones themselves, with little integration of increasingly sophisticated automation and IoT technologies. With the widespread development of IoT and sensor technology, UAV-mounted systems can be made more intelligent and convenient.
In this paper, the sports-venue illumination measurement system presented is based on a UAV with optical-flow positioning. The illumination acquisition system consists of an STM32F103RCT6 microcontroller module, an illumination acquisition module, a distance measurement module, and a wireless transmission module. The local-outlier factor (LOF) algorithm is chosen to handle anomalous values in the collected data [16,17]. In the experiments, we obtained LOF training data by modeling a standard sports stadium and determined the outlier-detection threshold. According to the experiments, the root mean square error (RMSE) between the illuminance data acquired by the system and the standard illuminance values is 2.45, and the error range for dynamic measurements is between −2.27% and −0.57%.

Illumination Acquisition System Architecture
The spectral luminous efficiency function represents the visual sensitivity of the human eye at different wavelengths. Since stadium illuminance mainly serves athletes and spectators, the response of the detection equipment used for measuring stadium illuminance needs to be consistent with that of the human eye. The illuminance sensor, which is the main information input of the dynamic measurement system, also needs fast measurement speed and high measurement accuracy. A photodiode illumination sensor built around ROHM Semiconductor's BH1750FVI chip is chosen as the system's illumination acquisition component. This chip is commonly used in electronic devices and projects that require light detection and measurement. Key features of the BH1750FVI include its compact size, low power consumption, and digital output. It uses a built-in 16-bit analog-to-digital converter (ADC) to provide accurate light-intensity readings and can measure a wide range of light levels, from very low to high intensities. As a real-time acquisition system mounted on a UAV, the system should have low power consumption, a small size, and high real-time performance. An STM32 chip made by STMicroelectronics is used as the central processing unit of the system proposed in this paper. The STM32F103RCT6 was chosen to facilitate mounting, as it is characterized by small size, low power consumption, and fast processing speed. Furthermore, it can run the FreeRTOS operating system, which offers higher real-time performance through time-sliced, multithreaded task scheduling.
The system also transmits illumination data to the cloud platform wirelessly via a WiFi communication module, the ESP8266 made by Espressif Systems [18]. The ESP8266 is designed for a total data-transfer rate of 2 Mbps, and the model used is the ESP8266-01S.
WiFi wireless transmission technology is easy to network, and no wiring is required for data transmission [14]. Meanwhile, the data-transmission distance can reach about 100 m. To cope with illuminance measurements at different height levels, a laser distance sensor is selected; in this paper, it is a ToF Sense-UART module, which can easily communicate with other modules. The OneNET IoT platform is selected as the data-receiving platform for wireless transmission; it supports various network environments and protocol types, and provides rich functions, including fast access to various sensors and smart hardware, rich APIs, and application templates that support the development of various industry applications.
As for the UAV carrying platform, the HUBSAN ZINO2+ UAV was chosen as the illumination acquisition platform. It is a professional-grade UAV that can hover accurately and stably indoors or outdoors and can carry a payload of up to 500 g. The UAV supports a variety of flight modes, such as variable-speed and constant-speed flight. In addition, the mobile app provides real-time feedback, recording the single-flight time and hovering position. The HUBSAN ZINO2+ UAV used in this experiment has an optical-flow positioning function, mainly used to determine position information in indoor environments. When the UAV is indoors, the optical-flow navigation system, a positioning method that has been used in UAV positioning and control systems in recent years, can determine the current position based on information acquired by a dedicated camera on the tail of the UAV. The structure of the illuminance acquisition module is shown in Figure 1: the system is structured in three main layers, comprising the UAV, the illumination acquisition module, and the cloud platform. The UAV serves as the working platform for the entire illuminance acquisition system and plays an important role both in illuminance acquisition and in positioning the measurement points. The illumination acquisition module serves as the data-input and command-execution part of the system, consisting of an illumination sensor (BH1750FVI), a WiFi transmission section (ESP8266), a laser distance-measurement module (ToF Sense), and a Li-ion battery power supply. The illuminance acquisition module can acquire illuminance data every 180 ms.
The individual parts of the module exchange data via different communication protocols: data transfer between the illuminance sensor and the STM32F103RCT6 is based on the I2C protocol; command delivery and data transfer between the ESP8266 and the STM32F103RCT6 use the UART serial protocol; and the STM32F103RCT6 also controls the laser distance-measurement module to complete the height-measurement judgment. When the UAV illumination acquisition system acquires illuminance in a sports venue, the illuminance distribution varies with height, as described by the inverse-square law; therefore, a distance-measurement module is required in the system. The ToF Sense performs distance measurement with a refresh rate of 30 Hz over a range of 0.03–8 m with an accuracy of ±0.03 m. In addition, the ToF Sense's small size and low power consumption make it very suitable for use as a sensor on the UAV. The communication between the various parts of the system is shown in Figure 2.
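The inverse-square dependence invoked above, which motivates carrying the distance sensor, can be sketched for an idealized point source (real venue luminaires are extended sources, so this is only an approximation; the intensity value is invented):

```python
def illuminance_at(intensity_cd, distance_m):
    """Point-source illuminance E = I / d^2 (lx), at normal incidence."""
    return intensity_cd / distance_m ** 2

# Doubling the measurement distance quarters the illuminance:
e_2m = illuminance_at(1000.0, 2.0)  # 250.0 lx
e_4m = illuminance_at(1000.0, 4.0)  # 62.5 lx
```

This is why readings taken at different flight heights cannot be compared without knowing the height at which each was acquired.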

System Software Design
The system proposed in this paper was developed in the Keil uVision5 integrated development environment; the program completes the illumination acquisition and configures the ESP8266 module to operate in the appropriate mode to connect to the cloud platform and transfer the data. FreeRTOS, a free real-time operating system with a compact kernel and open source code, is chosen as the basis for the programming. FreeRTOS organizes the overall program around tasks, program entities that each complete a segment of work for a specific purpose; the overall design is completed by switching between task priorities and controlling the four task states. The state transitions between the task states of the FreeRTOS real-time operating system are shown in Figure 3.

As shown in Figure 3, tasks in the FreeRTOS operating system are divided into four states: ready, blocked, running, and suspended. Whenever a task is created successfully, it is automatically placed in the ready state; if it has a higher priority than the running task, it enters the running state, otherwise it remains in the ready state. When a task in the running state calls a function associated with vTaskDelay(), the task switches to the blocked state. A task in the blocked state cannot execute or be called again, and it returns to the ready state when the blocking condition is met or when the event-time change in the diagram occurs. In the FreeRTOS operating system, the ready-state task with the highest priority goes into the running state. In addition, when the vTaskSuspend() function is called, a task is converted to the suspended state, in which it will not be scheduled indefinitely; if the task is to be rescheduled, it can only be resumed with the vTaskResume() function.
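The state machine described above can be modeled as a small sketch. This is a toy illustration of the four states and their transitions, not FreeRTOS code; the function names mirror the FreeRTOS API only as labels, and the task names are invented.

```python
READY, RUNNING, BLOCKED, SUSPENDED = "ready", "running", "blocked", "suspended"

class Task:
    def __init__(self, name, priority):
        self.name, self.priority, self.state = name, priority, READY

def schedule(tasks):
    """Move the highest-priority ready/running task to running; demote the rest to ready."""
    candidates = [t for t in tasks if t.state in (READY, RUNNING)]
    if not candidates:
        return None
    top = max(candidates, key=lambda t: t.priority)
    for t in candidates:
        t.state = RUNNING if t is top else READY
    return top

def vTaskDelay(task):      # running -> blocked (waiting on a time event)
    task.state = BLOCKED

def vTaskSuspend(task):    # any state -> suspended (never scheduled again until resumed)
    task.state = SUSPENDED

def vTaskResume(task):     # suspended -> ready
    if task.state == SUSPENDED:
        task.state = READY

sample = Task("sample_lux", 3)   # invented: illuminance-sampling task
send = Task("send_wifi", 2)      # invented: WiFi-upload task
assert schedule([sample, send]) is sample  # higher priority runs first
vTaskDelay(sample)                         # sampler waits on its 180 ms period
assert schedule([sample, send]) is send    # lower-priority task now runs
```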
For the system presented in this paper, when first powered up, the system detects whether it has received a signal to start illumination measurement and, once such a signal is detected, begins measuring. Before starting the illuminance measurement, the measurement height needs to be specified via the cloud platform. When the UAV takes off, the illuminance acquisition system compares the specified measurement height with the current height, and the system starts collecting illuminance once the UAV is within ±0.1 m of the specified measurement height. The collected illuminance data are then transmitted to the OneNET cloud platform, which can plot the illuminance variation curve in real time based on the data received. Once the illuminance has been collected, the local-outlier factor algorithm is used first to reject outliers, and the average of the illuminance data collected at the same point is then calculated to determine the illuminance value at that point. The program-flow diagram of the illumination acquisition system proposed in this paper is shown in Figure 4.

System Outlier Handling Algorithm
There are some outliers in the acquired data due to the instability of the drone while hovering and flying during illuminance acquisition; changes in the external environment can also greatly interfere with the UAV's measurements.

Data acquired by a UAV-based illumination acquisition system are randomly distributed, following neither a linear nor a Gaussian distribution. Detecting outliers in such data has important implications for the stability and accuracy of the system: outliers can lead to large inaccuracies in the measurement results, so it is necessary to find and eliminate them. Outliers are defined as data that stand out in the dataset to such a degree that one suspects they are not random deviations but are generated by a different mechanism. In terms of Euclidean distance, the definition can be stated as follows: let the acquired dataset be D = {x1, x2, . . . , xn}, generated by two mechanisms r(x) and f(x), producing the sets R = {r(x1), r(x2), . . . , r(xn)} and S = {f(x1), f(x2), . . . , f(xn)}; a point xi is then regarded as an outlier when it is produced by a different mechanism from the other points, i.e., when f(xi) is not equal to r(xi).
As shown in Figure 5, big-data objects follow a certain distribution law, but the data point x clearly deviates from it, so it can be considered to be produced by a different mechanism and judged as an outlier. In general, outliers can be divided into global outliers, local outliers, situational (or conditional) outliers, and collective outliers. In the design of the illumination measurement system, the LOF algorithm is selected as the outlier-detection method in this paper. The local-outlier factor determines whether a sample is anomalous by calculating its outlier factor and checking how far it lies from dense data. The local-outlier factor is a density-based local-outlier detection algorithm, implemented as follows. The d(o, p) is defined as the distance from point p to point o; dk(p) is the k-distance, obtained by radiating outward with p as the center of a circle until the k-th neighboring point is covered; and Nk(p) is the k-distance neighborhood of point p, defined as the set of all points within the k-distance of p. The reach_distk(p, o) is defined as shown in Equation (1). The distance-definition diagram is shown in Figure 6. Based on the above definitions, the local reachable density is defined in Equation (2).
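The quantities behind Equations (1) and (2) can be sketched directly in code. This is a plain illustration under the standard LOF definitions (reach_dist_k(p, o) = max(d_k(o), d(p, o)); lrd as the inverse mean reachability distance), not the paper's implementation; the sample coordinates are invented.

```python
import math

def k_distance(points, p, k):
    """d_k(p): distance from p to its k-th nearest neighbour."""
    return sorted(math.dist(p, o) for o in points if o != p)[k - 1]

def k_neighborhood(points, p, k):
    """N_k(p): all points within the k-distance of p."""
    dk = k_distance(points, p, k)
    return [o for o in points if o != p and math.dist(p, o) <= dk]

def reach_dist(points, p, o, k):
    """Equation (1): reach_dist_k(p, o) = max(d_k(o), d(p, o))."""
    return max(k_distance(points, o, k), math.dist(p, o))

def lrd(points, p, k):
    """Equation (2): inverse of the mean reachability distance over N_k(p)."""
    nbrs = k_neighborhood(points, p, k)
    return len(nbrs) / sum(reach_dist(points, p, o, k) for o in nbrs)

pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
# A point inside the tight cluster has a much higher local reachable
# density than the isolated point (5, 5).
```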
The lrdk(p) characterizes the density around point p: the higher the density of point p and its surrounding points, the smaller the reachable distance of each point (tending toward the respective k-distance), corresponding to a larger lrd value; the lower the density of point p and its surrounding points, the larger the reachable distance of each point (tending toward the actual distance between two points), corresponding to a smaller lrd value. In addition, the local-outlier factor can be defined from the lrd value as in Equation (3).
From Equation (3), we can see that the local-outlier factor of point p is the ratio of the average local reachable density of all points in the Nk(p) neighborhood of point p to the local reachable density of point p. When this ratio is greater than 1, the density of point p is less than the density of its surrounding points, and point p may be an outlier; when this ratio is less than 1, the density of point p is greater than the density of its surrounding points, and point p is likely a normal point. The density value can become infinite if the number of duplicates exceeds the number of k neighbors; therefore, the weighted local-outlier factor (WLOF) is defined for data containing duplicates. The details are as follows.
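Equation (3) can be sketched as a self-contained computation (a plain illustration of the standard LOF ratio, not the paper's firmware; the sample coordinates are invented):

```python
import math

def _kdist(pts, p, k):
    """d_k(p): distance from p to its k-th nearest neighbour."""
    return sorted(math.dist(p, o) for o in pts if o != p)[k - 1]

def _nbrs(pts, p, k):
    """N_k(p): points within the k-distance of p."""
    dk = _kdist(pts, p, k)
    return [o for o in pts if o != p and math.dist(p, o) <= dk]

def _lrd(pts, p, k):
    """Local reachable density of p (Equation (2))."""
    ns = _nbrs(pts, p, k)
    return len(ns) / sum(max(_kdist(pts, o, k), math.dist(p, o)) for o in ns)

def lof(pts, p, k):
    """Equation (3): mean lrd of p's neighbours divided by lrd of p.
    LOF > 1 suggests p is less dense than its neighbours (possible outlier)."""
    ns = _nbrs(pts, p, k)
    return sum(_lrd(pts, o, k) for o in ns) / (len(ns) * _lrd(pts, p, k))

cluster = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
pts = cluster + [(5.0, 5.0)]
# An in-cluster point scores about 1; the isolated point scores well above 1.
```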
As shown in Equation (4), w(o) is the number of duplicates in the data. After computing the weight values, the algorithm treats each set of duplicates as one data value. The calculation of the weighted local-outlier factor is shown in Equation (5). To use the LOF algorithm, its threshold must first be determined. In this paper, we used DIALux evo version 11.1 to model the venue in compliance with the venue illumination standards specified by the Standard for Lighting Design and Test of Sports Venues (JGJ 153-2016). In the venue model scenario, a uniform light distribution is set, with an illuminance uniformity of 0.62, which also meets the lighting-design standard and the illuminance requirements for a venue with televised coverage [19]. When acquiring data, the UAV sways within a small area around the measurement point, as observed in actual measurements, so this paper chose a 20 cm × 20 cm rectangular calculation surface to simulate the dynamic range of the UAV; the illuminance uniformity of the 25 points in each rectangle (the ratio of minimum to average illuminance) is about 0.99 according to the DIALux simulation. We also took full account of the representativeness of the small calculation surfaces: the planes were set at different illumination intervals with reference to the iso-illuminance curves of the illumination distribution. In terms of height, we selected a calculation surface every 5 cm for a total of 5 calculation surfaces and placed illuminance calculation surfaces at heights of 1 m, 2 m, 3 m, and 4 m separately, to meet the requirements of measuring illuminance at different heights in different competition venues.
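The duplicate handling behind Equations (4) and (5) can be sketched as follows: each group of identical readings is collapsed to a single value carrying a weight w(o), so a value repeated more than k times cannot produce an infinite density. The exact WLOF weighting follows the paper's Equation (5); the sketch below shows only the weight computation and collapsing step, with invented readings.

```python
from collections import Counter

def collapse_duplicates(readings):
    """Return (unique_values, weights), where weights[i] = w(o) for value i."""
    counts = Counter(readings)
    values = sorted(counts)
    return values, [counts[v] for v in values]

# Invented illuminance readings (lx) containing duplicates:
vals, w = collapse_duplicates([803, 803, 805, 806, 806, 806])
# vals -> [803, 805, 806], w -> [2, 1, 3]
```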
The selection of calculation surfaces within the model and the results of the local-outlier factor detection are shown in Figures 7 and 8. As shown in Figure 7, the horizontal axis represents the local-outlier factor values calculated from the illuminance data points in the different data-acquisition planes, and the vertical axis represents the number of points obtaining each local-outlier-factor value. As can be seen, most of the local-outlier factor values are concentrated below 0.5, indicating that the data density is relatively high and that there is very little variation in illuminance within each calculation plane. In illuminance measurement, the points within a small plane are taken as the measurement-point illuminance values, and the measurement points face the situation of calculating the uniformity of illumination.
We set the threshold so as to remove 10 percent of the data, in order to eliminate the effect of excessive differences in illuminance values between the different calculation planes. In Figure 7, the blue vertical line marks the threshold calculated for each height plane of data; the results are 0.63, 0.69, 0.51, and 0.65, respectively. As shown in the figure, the points to the right of the blue vertical line are the data discarded from the dataset. Because of the large differences in illuminance values between calculation planes at different heights, the raw values cannot be used as criteria for determining outliers, so the LOF algorithm determines the threshold value so that it flags 5 percent of the training observations as anomalies.
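The per-plane threshold choice described above can be sketched as a nearest-rank quantile: for each height plane, the cut-off is placed so that a fixed fraction of the training LOF scores is kept and the rest are rejected. The score values below are invented for illustration.

```python
import math

def lof_threshold(scores, keep_fraction=0.90):
    """Score at the keep_fraction quantile (nearest-rank); points above it are rejected."""
    ranked = sorted(scores)
    idx = max(0, math.ceil(keep_fraction * len(ranked)) - 1)
    return ranked[idx]

# Invented LOF scores for one height plane:
train_scores = [0.31, 0.35, 0.40, 0.42, 0.45, 0.48, 0.50, 0.55, 0.63, 1.80]
t = lof_threshold(train_scores)                 # per-plane threshold
rejected = [s for s in train_scores if s > t]   # scores rejected as outliers
```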
In the use of the algorithm, the nearest point to the selected point is found according to the maximum-variance method: the smaller value is selected in the first divided region, and the algorithm then looks for the smaller value in the second divided region, continuing until the smallest value is found. The flow chart of the LOF algorithm is shown in Figure 8.

System Implementation and Testing
We built the illumination acquisition system based on the chosen hardware and the methods described above. In building the system, the placement of the module must be considered so that nothing obstructs it during illumination measurement. Therefore, the illumination acquisition system is mounted on the top of the UAV, and it is powered by a 3.7 V, 1800 mAh battery, because the system requires enough power to function properly and to transmit data over long distances. Size and weight were also taken into account so that the illumination acquisition system can be carried on the UAV platform; the physical hardware is 11 cm wide and 17 cm long. The physical hardware design is shown in Figure 9.
Because illuminance values differ greatly between calculation planes at different heights, the raw values cannot be used directly as criteria for identifying outliers; instead, the LOF algorithm determines a threshold such that 5 percent of the training observations are detected as anomalies.
When the algorithm runs, the nearest neighbours of the selected point are found using the maximum-variance method: the smaller value is selected in the first divided region, then the search continues for a smaller value in the second divided region, and so on until the smallest value is found. The flow chart of the LOF algorithm is shown in Figure 8.
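The thresholding step described above (score each sample with LOF, then discard the highest-scoring fraction) can be sketched in pure Python for one-dimensional illuminance samples. This is an illustrative implementation, not the authors' code; the readings and the choice of k = 3 are hypothetical.

```python
def _knn(data, i, k):
    """Indices of the k nearest neighbours of sample i (1-D distances)."""
    order = sorted((j for j in range(len(data)) if j != i),
                   key=lambda j: abs(data[i] - data[j]))
    return order[:k]

def _k_distance(data, i, k):
    """Distance from sample i to its k-th nearest neighbour."""
    return sorted(abs(data[i] - data[j]) for j in range(len(data)) if j != i)[k - 1]

def _lrd(data, i, k):
    """Local reachability density of sample i."""
    nbrs = _knn(data, i, k)
    reach = [max(_k_distance(data, j, k), abs(data[i] - data[j])) for j in nbrs]
    total = sum(reach)
    return len(reach) / total if total > 0 else float("inf")

def lof(data, i, k=3):
    """Local-outlier factor: mean neighbour density relative to sample i's own."""
    nbrs = _knn(data, i, k)
    return sum(_lrd(data, j, k) for j in nbrs) / (len(nbrs) * _lrd(data, i, k))

# Hypothetical readings (lx): a tight cluster plus one spike.
readings = [548.0, 549.0, 549.5, 550.0, 550.5, 551.0, 552.0, 600.0]
scores = [lof(readings, i) for i in range(len(readings))]

# Discard the highest-scoring 10 percent, as in the thresholding step above.
cutoff = sorted(scores)[int(0.9 * len(scores))]
kept = [r for r, s in zip(readings, scores) if s < cutoff]
```

Inliers in a dense cluster score close to 1, while the isolated 600 lx spike scores far higher and falls above the cutoff.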

System Implementation and Testing
We have built the illumination acquisition system from the chosen hardware and the method described above. When building the system, its mounting location must be considered so that nothing obstructs it during illumination measurement. The illumination acquisition system is therefore mounted on top of the UAV, and it is powered by a 3.7 V, 1800 mAh battery, since it requires enough power to function properly and to transmit data over long distances. Size and weight must also be taken into account so that the system can be carried on a UAV platform; the physical hardware is 11 cm wide and 17 cm long. The physical hardware design is shown in Figure 9. To determine the stability of the illuminance acquisition system during dynamic measurements, we used the illuminance acquisition system and the SPIC-200 illuminance meter (manufactured by Hangzhou Yuanfang Optoelectronics Co., Ltd., Hangzhou, China) to measure illuminance separately. The measurements were taken at night in a closed laboratory, so the influence of the external light environment on the experiments was reduced to a minimum.
During the experiment, illuminance values were measured twice at each point using the SPIC-200 illuminance meter and the illuminance acquisition module; the resulting data is shown in Figure 10. The measurement points were set at 0.5 m intervals starting at the laboratory wall. The blue curve shows the illuminance values measured by the illuminance acquisition module, and the red curve shows the values measured by the SPIC-200 illuminance meter. There is little difference between the two, and the root mean square error is 2.45 according to Equation (6).
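Equation (6) is the standard root-mean-square error. As a quick sketch of how a figure like 2.45 would be computed from the paired readings (the function name and sample values here are ours, not the paper's data):

```python
def rmse(measured, reference):
    """Root mean square error between paired illuminance readings (lx)."""
    pairs = list(zip(measured, reference))
    return (sum((m - r) ** 2 for m, r in pairs) / len(pairs)) ** 0.5

# e.g. module readings vs. meter readings at the same points (hypothetical)
err = rmse([101.0, 205.0, 298.0], [100.0, 203.0, 300.0])
```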
We also verified the dynamic response of the illuminance acquisition system in the range of 1.5–2 m from the light source, keeping the position of the light source fixed during the experiment. The SPIC-200 illuminance meter was used to measure the illuminance at 1.5 m, and the maximum of its readings was recorded. Next, we used the illuminance acquisition system to measure the illuminance several times over the 1.5–2 m range while sliding it, recording the maximum value of each measurement. The dynamic-measurement error is obtained by calculating the difference between the two; the details are given in Table 1. According to the data in the table, the error range for dynamic measurements is between −2.27% and −0.57%, which is within the standard range. Thus, the illuminance acquisition system can meet dynamic illuminance measurement requirements.
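The per-point error in Table 1 appears to be the relative difference between the dynamic (sliding) maximum and the static reference maximum, expressed as a percentage. Assuming that interpretation, a minimal sketch:

```python
def dynamic_error_percent(dynamic_lx, static_lx):
    """Relative error of a dynamic reading against the static reference, in percent."""
    return (dynamic_lx - static_lx) / static_lx * 100.0

# hypothetical: a dynamic maximum of 495 lx against a static reference of 500 lx
err = dynamic_error_percent(495.0, 500.0)  # negative: dynamic reading is low
```

A negative error, as in Table 1, means the sliding measurement slightly underestimates the static value.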

Illumination Acquisition Experiment
The UAV illumination acquisition experiment in this paper was carried out in the sports stadium at Dalian Polytechnic University. The stadium is a versatile facility measuring 40 m in length, 20 m in width, and 10 m in height, housing designated areas for badminton, basketball, and volleyball. It is well lit by evenly distributed overhead lighting, ensuring optimal visibility for athletes. The illuminance measurement experiment was carried out at night to avoid the influence of sunlight. The experimental scenario was built as shown in Figure 11.

As the stadium floor is yellow, green illuminance collection points were arranged on the ground so that measurement points could be accurately identified during the UAV illuminance collection process. According to the stadium illuminance standard, the points were set at 2 m intervals, and illuminance was measured using the center-point method. During the experiment, the stadium lamps in the illuminance measurement area were arranged as shown in Figure 12. In the illumination acquisition experiment, the UAV flew at a constant speed of 1 m/s. The position of the UAV was determined from its real-time reports of the current flight altitude and the distance traveled in a single flight, and whether the drone was above a measurement point could be judged from the live feed of its downward-facing gimbal camera; the drone's live image feed is shown in Figure 13. Although the illuminance measurement experiments were carried out indoors, the drone recorded the flight data during the experiment, including flight height, flight distance, and flight speed; the UAV flight log is shown in Figure 14.
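The 2 m point layout can be sketched as follows. The center-point method is commonly described as placing one measurement point at the center of each grid cell; `center_point_grid` is our hypothetical helper, since the paper does not give the exact layout procedure.

```python
def center_point_grid(length_m, width_m, spacing_m):
    """Center-point method: one measurement point at the center of each
    spacing x spacing grid cell on the measurement-area floor."""
    nx, ny = int(length_m // spacing_m), int(width_m // spacing_m)
    return [(spacing_m * (i + 0.5), spacing_m * (j + 0.5))
            for i in range(nx) for j in range(ny)]

# for the 40 m x 20 m hall at 2 m spacing: a 20 x 10 grid of points
points = center_point_grid(40, 20, 2)
```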

Illumination Data Processing
In the process of drone illumination acquisition, the drone hovers over the illumination collection point for 3 min and transmits the collected illumination to the OneNET cloud platform in 3-s intervals. The cloud platform data is shown in Figure 15.

In Figure 15, the platform shows an illuminance value of 542 lx. Two switches are also displayed on the cloud platform: one indicates whether the system is currently in a data-collection state, and the other controls whether illumination collection is on or off. From the collected data, a real-time variation curve of the acquired illuminance can be plotted; in the line graph, the horizontal axis is the time of illuminance acquisition, the vertical axis is the current illuminance value, and the curve refreshes with the current measured illuminance value every 3 s.
In addition, the current illuminance-collection height can be set in the system via the cloud platform. Illuminance at a height of 4 m in the stadium was acquired by this method: the UAV was flown to the designated measurement point and hovered there for 3 min, so that the illuminance of the point could be calculated more accurately. The illuminance data for one measurement point is shown in Table 2. As shown in Table 2, the measurement system recorded illuminance data at 3-s intervals and transmitted the collected data to the cloud platform; the illumination acquisition system takes 120 s to acquire data at the measurement point, for a total of 60 data points. The outliers in the illuminance data at the collection points can then be rejected using the thresholds determined by the LOF algorithm. The rejection of outliers at a measurement point is shown in Figure 16.
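Combining the per-plane LOF threshold with the samples collected at a point, the point's final value is the mean of the readings that survive the threshold. A minimal sketch with hypothetical readings and scores:

```python
def point_illuminance(readings_lx, lof_scores, threshold):
    """Mean of the readings whose LOF score falls below the plane's threshold."""
    kept = [r for r, s in zip(readings_lx, lof_scores) if s < threshold]
    return sum(kept) / len(kept)

# hypothetical: two steady readings and one spike flagged by its LOF score,
# against the 0.63 threshold reported for one of the height planes
avg = point_illuminance([550.0, 551.0, 900.0], [0.40, 0.45, 2.10], 0.63)
```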
In Figure 16, the horizontal axis represents the time of acquisition and the vertical axis the illuminance value. The outliers, marked in red, are clearly identified, while the remaining points are judged to be normal values and are marked in blue. It can also be seen that the normal values are concentrated, with a spread of around 50 lx. With the outliers removed, the calculated average of 550.69 lx is taken as the illuminance value for that point.
Based on the data obtained from the illuminance measurement-collection experiment, the horizontal distribution of illuminance at a height of 4 m above ground level was explored, as shown in Figure 17. The distribution of illumination in the stadium becomes more complex as the height increases. On the horizontal plane, the distances from the measurement starting point in the eastern and western directions are used as the X and Y axes, respectively, and the illuminance value is used as the Z axis; the illuminance data is interpolated using cubic spline interpolation to make the trend of illuminance changes easier to explore. The illuminance distribution at 4 m height in the stadium is plotted, and the image shows that, as the height increases, the illuminance rises significantly directly below each luminaire and falls significantly between luminaires.
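The paper states that the grid data was interpolated with cubic splines but does not name a tool; one common way to do this on a regularly spaced grid is SciPy's `RectBivariateSpline`. The grid and readings below are placeholders, not measured data.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Hypothetical coarse grid of illuminance readings (lx) at 2 m spacing.
x = np.arange(0.0, 10.0, 2.0)   # 5 positions along one horizontal axis, m
y = np.arange(0.0, 8.0, 2.0)    # 4 positions along the other axis, m
z = 500.0 + 50.0 * np.outer(np.sin(x / 3.0), np.cos(y / 3.0))

# Cubic in both directions (kx=ky=3); s=0 keeps the spline interpolating,
# so it passes exactly through the measured grid values.
spline = RectBivariateSpline(x, y, z, kx=3, ky=3, s=0)
xf = np.linspace(0.0, x[-1], 50)
yf = np.linspace(0.0, y[-1], 50)
zf = spline(xf, yf)             # dense 50 x 50 surface for plotting trends
```

The dense surface `zf` is what would be rendered as the height-level illuminance map.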


Discussion
In this paper, we have built a stadium illuminance measurement system that takes advantage of the convenience of a UAV, combining a widely used optical-flow positioning function, an illumination sensor, a distance measurement module, Wi-Fi data transmission, and a cloud platform. Based on the real-time feedback data from the mobile-phone app, the UAV position information is corrected, and the illumination acquisition point can be located with the help of the real-time image returned from the drone. To address the problem of inaccurate measurements caused by the unstable hovering of UAVs, we used the LOF algorithm to detect outliers in the acquired data. In this way, the stability of the illuminance measurement system can be checked, and large fluctuations in the acquired data can be detected in time. The data collected by the system is displayed in real time via the cloud platform, and the acquisition process can also be controlled from the platform. On the one hand, this makes it possible to monitor the data collection of the UAV illumination acquisition system in time; on the other hand, it makes it convenient to control the system until it is accurately positioned.
In future research, the measurement system will be further improved to measure vertical illuminance under the different environmental requirements of sports stadiums and to explore the detailed light-flow distribution and a three-dimensional representation of illuminance in sports stadiums.

Conclusions
Illuminance measurement in stadiums usually has to deal with large measuring areas, high measuring heights, and complex measuring fields. The UAV illumination measurement system proposed in this paper provides an effective solution to this problem. It has been experimentally verified to have good stability and accuracy: the root mean square error between the illuminance data acquired by the system and the illuminance standard values is 2.45, and the error range for dynamic measurements is between −2.27% and −0.57%. In addition, this paper explored the distribution of illuminance at different heights in the stadium, and the distribution at high levels was mapped to provide a better understanding of the light conditions in the stadium.

Informed Consent Statement: Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement:
No new data were created or analyzed in this study. Data sharing is not applicable to this article.