Towards smart sustainable cities using Li-Fi technology: geo-location infrastructure utilizing LED street lights

Background: Cities are shifting toward providing more efficient services, and the Internet of Things (IoT) is becoming the future of things. The shift toward using eco-friendlier LED lights to illuminate cities is another genuine game-changer for the future of Light Fidelity (Li-Fi) technology. Li-Fi is a visible light communication (VLC) technology that uses Light Emitting Diode (LED) bulbs for communication. The utilisation of thousands of light sources around a city acting as wireless access points and delivering location-based content will shift cities towards being smart sustainable cities. Recently, this technology has received considerable attention from the research community, and various studies have been conducted to improve this field. However, there is a noticeable need to develop real-world systems that utilise Li-Fi technology.

Methods: This article aims to contribute to developing a Geo-Li-Fi system that uses LED lights to provide services for collecting contextual data and delivering location-based services (LBS) in different areas of the city. The system is described along with details of its design, implementation and development. Moreover, the overall set-up of the testbed that was used to evaluate the proposed system is presented. In addition, an experiment is conducted using a real-world scenario to test the functionality of the system and report the outputs.

Results: The effect of the system is discussed with respect to different aspects of sustainability, including economic, social and environmental aspects. The system was tested in indoor and outdoor environments, and sunlight was found not to affect the ability of the LEDs to deliver content during the daytime. The transmission range of the LED lamp is affected by several factors: it depends mainly on the power of the lamp, so it increases significantly when the LED power is increased. Also, an increase in the beam angle results in a wider coverage area, which is in turn affected by the light intensity.
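The dependence of range on lamp power and beam angle noted above follows the Lambertian line-of-sight model that is standard in the VLC literature (background material, not this article's own derivation): the Lambertian order m is set by the LED's semi-angle at half power, and the received optical power falls off with the square of the distance.

```latex
% Lambertian order from the semi-angle at half power \Phi_{1/2}
m = -\frac{\ln 2}{\ln\left(\cos\Phi_{1/2}\right)}

% Line-of-sight channel DC gain at distance d, irradiance angle \phi,
% incidence angle \psi; A is the detector area, T_s(\psi) the optical
% filter gain and g(\psi) the concentrator gain
H(0) = \frac{(m+1)\,A}{2\pi d^{2}}\,\cos^{m}(\phi)\,T_s(\psi)\,g(\psi)\,\cos(\psi)

% Received optical power for transmit power P_t
P_r = H(0)\,P_t
```

A higher transmit power P_t raises P_r at every distance, extending the range, while a wider semi-angle lowers m and spreads the same power over a larger area at reduced intensity, consistent with the trade-off reported in the results.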


a. WEBSITE
There are four groups of components that were tested individually in the previous section (Unit Testing): group 1 contains the general functions, group 2 the region functions, group 3 the content functions and group 4 the LED functions. In this section, we integrate the components one by one within each group and test the integrated components to make sure they work as expected.
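As an illustration of this bottom-up integration, the sketch below wires hypothetical region, content and LED units together and checks the combinations; the function names are placeholders for illustration, not the project's actual code.

```python
# Minimal bottom-up integration-test sketch. The units below are
# hypothetical stand-ins for the project's region (group 2), content
# (group 3) and LED (group 4) functions.

def create_region(name):
    return {"name": name, "contents": []}

def create_content(title):
    return {"title": title, "led": None}

def link_content_to_led(content, led_id):
    content["led"] = led_id
    return content

def test_content_can_be_attached_to_region():
    # integrate group 2 (regions) with group 3 (content)
    region = create_region("Campus")
    region["contents"].append(create_content("Welcome"))
    assert len(region["contents"]) == 1

def test_content_links_to_led():
    # integrate group 3 (content) with group 4 (LEDs)
    content = link_content_to_led(create_content("Menu"), led_id=7)
    assert content["led"] == 7

test_content_can_be_attached_to_region()
test_content_links_to_led()
print("integration tests passed")
```

Each pairwise integration is exercised before the next component is added, mirroring the group-by-group procedure described above.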

b. APPLICATION
We applied the integration testing in the application by the following steps:

REGRESSION TESTING
Regression testing is a type of software testing carried out to ensure that previously tested code still performs the way it should after new changes. Changes may include enhancements, patches and configuration changes.
We applied regression testing by re-testing all of the system's functionality after any change or update, to make sure that the system's functionality still works as expected.

Test case: Enter invalid email
Description: The user enters an email in an incorrect format.
Action: Click on the Register button.
Expected result: An error message will be displayed prompting the user to enter a valid email.

Test case: Enter valid inputs
Description: The user enters valid inputs.
Action: Click on the Log in button.
Expected result: The user will be redirected to the dashboard.
Pass: Yes

Test case: Enter invalid email or password
Description: The user enters invalid inputs.
Action: Click on the Log in button.
Expected result: An error message will be displayed prompting the user to enter valid inputs.
Manage Content Test Cases
These test cases show the different scenarios that a user may experience when trying to add or update content in the system.

Test case #1: Enter valid inputs
Description: The user enters valid information.
Action: Click on Add / Update content.
Expected result: The content will be added/updated and a success message will be displayed. If adding, the user will be redirected to the all-content page; if updating, the user will be redirected to the view-content page.
Pass: Yes

Manage Region Test Cases
These test cases show the different scenarios that a user may experience when trying to create or update a region in the system.

Test case #1: Enter valid inputs
Description: The user enters valid information.
Action: Click on Create / Update region.
Expected result: The region will be added/updated and a success message will be displayed. If adding, the user will be redirected to the all-regions page; if updating, the user will be redirected to the view-region page.
Pass (Add): Yes

Manage LEDs Test Cases
These test cases show the different scenarios that a user may experience when trying to add or update LED information.

Test case: Leave empty fields
Description: The user leaves some or all fields empty.
Action: Click on Add / Update LED.
Expected result: Error messages will be displayed prompting the user to complete the empty fields.

Test case: Exceed the accepted number of characters in the LED description field
Description: The user enters more than 100 characters in the LED description field.
Action: Click on Add / Update LED.
Expected result: The LED description field will be blocked and its border color will change to red.
Pass (Add): Yes

View (Region, Content and LEDs) Test Cases

Test case #1: Delete (Region, Content or LED)
Description: The user clicks on the delete icon to delete a region/content/LED.
Action: Click on the delete icon.
Expected result: A confirmation message will be displayed. If the user presses Yes, the selected region/content/LED will be deleted and a success message will be displayed; if the user presses No, the system will display the item again.
Pass (Delete a Region): Yes

Test case: Edit an item
Description: The user clicks on the edit icon of an item.
Action: Click on the edit icon of the item.
Expected result: The user will be redirected to the update-region, update-content or update-LED page, according to the item's type.
Pass (Region): Yes

Test case: Delete selected regions
Expected result: A confirmation message will be displayed. If the user presses Yes, the selected regions will be deleted and a success message will be displayed; if the user presses No, the system will display the All-regions page.

Test case: Delete with no region selected
Expected result: An error message will be displayed prompting the user to select at least one region.

Test case: Click on Link button
Description: The user clicks on the Link button of a content.
Action: Click on the Link button.
Expected result: The content will be linked with the selected LED and the Link button will change to Unlink.

Test case: Click on Unlink button
Description: The user clicks the Unlink button of a LED.
Action: Click on the Unlink button.
Expected result: A confirmation message will be displayed. If the user clicks Yes, the content will be unlinked from the selected LED and the Unlink button will change to Link; if the user clicks No, the system will display the link-content page.
Reset Password Test Cases

Test case: Enter invalid email
Expected result: An error message will be displayed prompting the user to enter a valid email.

Test case: Enter incorrect password
Description: The user enters an incorrect password.
Action: Click on Reset Password.
Expected result: An error message will be displayed prompting the user to enter the correct password.

Test case: Enter invalid email
Description: The user enters an email in an incorrect format.
Action: Click on the Register button.
Expected result: An error message will be displayed prompting the user to enter a valid email.

Test case #4: Enter an email that is already registered
Description: The user enters a registered email.
Action: Click on the Register button.
Expected result: An error message will notify the user that the email is already registered.

Test case: Click on Timeline (not in range of a LED)
Description: The user clicks on Timeline when he is not in range of a LED.
Action: Click on Timeline.
Expected result: An error message will be displayed indicating that no Li-Fi signal was found.

Test case: View favorite list (logged in)
Description: The user clicks on Favorite while logged in.
Action: Click on Favorite.
Expected result: The user's favorite list will be displayed.

Test case #2: View favorite list (not logged in)
Description: The user clicks on Favorite while not logged in.
Action: Click on Favorite.
Expected result: An error message will be displayed prompting the user to log in to view his favorites list.

Test case #3: Remove content from favorite list
Description: The user deletes a content from his favorite list.
Action: Click on the heart icon of a content.
Expected result: The content will be removed from the list.

Test case: Enter valid inputs (share content)
Description: The user enters valid inputs.
Action: Click on the Share button.
Expected result: The content will be linked with the LED, and the user will be redirected to the Timeline.
Pass: Yes

a. Test results
During a run time of 15 minutes, we found that the highest memory usage was 194.8 MB, the lowest was 72 MB and the average was 150 MB. From these results we can see that the mobile app's RAM usage does not exceed 10%, so we can safely say that the application is light on the device's memory. We also ran the application in parallel with other apps, and this did not affect the app's performance. In addition, we performed all of the application's functionalities, and the application never crashed.

Figure 1: Application performance

Finally, the app gave a realistic user experience over a slow Internet connection: the app still receives content, but multimedia content takes more time to appear on the screen. From these results we conclude that our application passed the performance testing.
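The memory figures above are simple aggregates over periodic profiler samples. The sketch below shows how such a summary is computed; the sample values are made up (chosen to reproduce the reported minimum, maximum and average), and the 2 GB device RAM is an assumption for illustration.

```python
# Hypothetical memory samples in MB, as a profiler might record them
# periodically over the 15-minute session (illustrative values only).
samples_mb = [72.0, 130.0, 168.0, 194.8, 160.0, 175.2]

peak = max(samples_mb)
low = min(samples_mb)
avg = sum(samples_mb) / len(samples_mb)

print(f"peak={peak} MB, low={low} MB, avg={avg:.1f} MB")

# Fraction of a device's RAM used at the peak (assuming a 2 GB device)
device_ram_mb = 2048
print(f"peak usage = {peak / device_ram_mb:.1%} of device RAM")
```

With these values the peak works out to under 10% of a 2 GB device's RAM, matching the claim above.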

STRESS TESTING
Stress testing determines the upper limits and the sizing of the infrastructure by pushing the application or its supporting infrastructure towards failure. It is hard to perform stress testing manually, since it is difficult to simulate multiple user inputs and interrupts fast enough to strain the system; with the use of tools, however, these test scenarios can be simulated easily. To perform stress testing, the Neoload tool was used. Neoload is an automated testing tool for measuring the performance of applications, websites and APIs. Using Neoload, two tests were applied to measure our application's performance.
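The article uses Neoload for this; conceptually, a stress test ramps up concurrent virtual users and records successes, failures and elapsed time. The bare-bones sketch below illustrates the idea with a stubbed request function; it is not the Neoload API, and `fake_request` stands in for a real HTTP call.

```python
import concurrent.futures
import random
import time

def fake_request(user_id):
    """Stand-in for one virtual user's request; replace with a real HTTP call."""
    time.sleep(random.uniform(0.01, 0.05))  # simulated server latency
    return 200  # simulated status code

def stress_test(num_users):
    """Fire num_users concurrent requests and report how many succeeded."""
    start = time.time()
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_users) as pool:
        statuses = list(pool.map(fake_request, range(num_users)))
    elapsed = time.time() - start
    ok = sum(1 for s in statuses if s == 200)
    return ok, num_users, elapsed

ok, total, elapsed = stress_test(20)
print(f"{ok}/{total} requests succeeded in {elapsed:.2f}s")
```

Increasing `num_users` (as the tests below do, from 20 to 40) and lengthening the run is exactly how the load is scaled up in a real tool.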

a. First result
It can be seen in Figure 107 that our application passed the first test, under a load of 20 users within two minutes. Of these users, 20% used a Motorola Moto X (Android 6.0), 30% used a Samsung Galaxy S7 (Android 6.0) and the remaining 50% used a Samsung Galaxy S6 (Android 5.0).

Figure 2: First stress test result
As can be seen in Figure 108, the application handled a heavy load: the number of users was increased from 20 to 40 active users, and the duration was increased to 15 minutes. The application showed no signs of bugs or memory leaks and did not crash under the load.

USER ACCEPTANCE TESTING
The objective of the user acceptance testing is to confirm that the system under test meets its requirements, to provide confidence that the system works correctly, and to measure its usability before it is delivered to the end users [54].
The usability of the system was measured against three criteria:
• Effectiveness: by measuring the number of errors detected when the user performs a specific function.
• Efficiency: by measuring the time that the user takes to perform a specific function.
• Satisfaction: by using a survey to discover the users' feedback about the system.
Ten users participated in the acceptance testing: five of them tested the website and the other five tested the application. To test the efficiency and effectiveness of our system, the average time the users needed to complete a specific task was computed, and the average number of errors they made was recorded. The results are shown in Table 106 for the website and in Table 109 for the application.
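The per-task averages in those tables can be computed directly from the recorded trials. The sketch below shows the computation; the individual timings and error counts are made up, chosen only so that they reproduce the 53-second "add region" and 23-second "update region" averages discussed later.

```python
# Hypothetical recordings: seconds and error counts per user for two tasks
# (values invented to reproduce the averages quoted in the text).
trials = {
    "add region":    {"seconds": [50, 55, 54, 52, 54], "errors": [0, 0, 1, 0, 0]},
    "update region": {"seconds": [22, 24, 23, 23, 23], "errors": [0, 0, 0, 0, 0]},
}

for task, data in trials.items():
    avg_time = sum(data["seconds"]) / len(data["seconds"])
    avg_errors = sum(data["errors"]) / len(data["errors"])
    print(f"{task}: avg {avg_time:.0f} s, avg {avg_errors:.1f} errors")
```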
To measure user satisfaction with the system, a survey was prepared that measures different aspects of the system, such as ease of use and learnability, feedback and errors, consistency and screen displays, efficiency, and subjective satisfaction. The questions were constructed as seven-point rating scales; users were asked to rate their agreement with the statements, ranging from strongly disagree to strongly agree. The questions were chosen based on several HCI rules measuring usefulness, reliability, usability, consistency, learnability, robustness and satisfaction. The questions included in the survey appear in Table 104.

Users needed some time to become familiar with the "add form"; because of that, the number of seconds taken to add a region (53 seconds) is relatively higher than for updating it (23 seconds). While performing the test, we noticed that the number of seconds taken by all test users to accomplish the same task was almost the same. From this we conclude that the ease of use of the system is at the same level for different types of users, regardless of their skills and background.

Ease of Use and Learnability
The time taken for managing LEDs dropped from approximately 128 seconds for "Add" to 33 seconds for "Update", which means that once the user became familiar with the interfaces, ease of use increased.
It can also be seen that the number of errors encountered while completing any of the tasks is almost zero; this indicates the level of ease and learnability of the system.
As mentioned earlier, the user must fill in a survey after testing any of the subsystems.
Some of the significant survey results regarding the admin web panel appear in the tables below. Regarding the results of the survey, the majority of users "strongly agree" that the website is easy to use, requires the fewest steps possible to accomplish the tasks, and that they can easily remember how to use it. This relates to the results obtained from measuring the time taken to accomplish the tasks. Consistency plays an important role in achieving these results, as it entails ease of use, learnability and familiarity. It can be seen that 3 out of 5 users "strongly agree" that they can recover from mistakes quickly and easily; this statement concerns how the system notifies the user about incorrect input, missing fields and unintended actions. In addition, 4 out of 5 users "strongly agree" that the messages provided to them in case of mistakes are meaningful and jargon-free. When considering the screen display style and consistency among windows, all of the users "agree" that the choice of screen and font colors is suitable, widget locations and colors are consistent across displays, wording is consistent across displays, and icons and symbols reflect the intended tasks. This verifies the result concluded from analyzing the measurements in Table 106, namely the acceptable level of consistency achieved throughout the pages.

However, most users reported that guidance information is not always available on all pages, which may highlight the need to provide more guidance on how to use the system. Finally, 80% of users were overall satisfied with it.

APPLICATION TESTING
The table above shows the average time that users needed to complete each function in the application and the average number of errors they made in each function.
It can be seen from the table that the average number of errors that occurred in uploading content is 0.2, while it is 0 in all the other functions. This could be due to the fact that uploading content requires more steps than the other tasks. The upload-content function took 29.4 seconds on average. In addition, viewing users' and admin content took between 4 and 5.2 seconds on average, and the related functions, such as saving content to the favorite list and viewing the favorite list, took between 3.3 and 3.31 seconds. Finally, viewing all regions and viewing a specific region took between 2 and 2.2 seconds on average. Registering in the system required the highest number of seconds among the functions. When the user first downloads the application, the first function to interact with is registration; accordingly, it takes some time until the user becomes familiar with how the application works, which justifies this result. The other functions took a noticeably low amount of time to accomplish.
Regarding the results of the survey, most users "strongly agree" that the application is easy to use. Moreover, all of them "strongly agree" that it requires the fewest steps possible to accomplish the tasks, and that they can easily remember how to use it, confirming the learnability and ease of use of the application. Concerning the robustness of the application, most users "agree" that they can recover from any mistakes they make quickly and easily, and that the provided messages are meaningful and jargon-free. 90% of users "strongly agree" that the choice of screen and font colors is suitable, widget locations and colors are consistent across displays, wording is consistent across displays, and icons and symbols reflect the intended task. Furthermore, most users "strongly agree" that the application does everything they expect it to do, and that shifting among windows is easy.
However, 4 out of 5 users reported that guidance information is not always available.
Moreover, most users "strongly agree" that it works the way they want it to work, and that it is designed for all levels of users. Finally, 80% of users were overall satisfied with the application.