Systematic Review

Testbed Facilities for IoT and Wireless Sensor Networks: A Systematic Review

Institute of Electronics and Computer Science, 14 Dzerbenes St., LV-1006 Riga, Latvia
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
J. Sens. Actuator Netw. 2023, 12(3), 48; https://doi.org/10.3390/jsan12030048
Submission received: 21 April 2023 / Revised: 7 June 2023 / Accepted: 7 June 2023 / Published: 15 June 2023

Abstract

As the popularity and complexity of WSN devices and IoT systems increase, the testing facilities should keep up. Yet, there is no comprehensive overview of the landscape of testbed facilities conducted in a systematic manner. In this article, we provide a systematic review of the availability and usage of testbed facilities published in the scientific literature between 2011 and 2021, covering 359 articles about testbeds and identifying 32 testbed facilities. The review reveals which testbed facilities are available and identifies several challenges and limitations in their use, including a lack of supportive materials and a limited focus on debugging capabilities. The main contribution of this article is a description of how different metrics impact the usage of testbed facilities; the review also highlights the importance of continued research and development in this field to ensure that testbed facilities continue to meet the changing needs of the ever-evolving IoT and WSN domains.

1. Introduction

The concept of the Internet of Things (IoT) appeared more than two decades ago [1]; nowadays, it is a fundamental field of research and has had a major impact on industry since its inception. In 2022, the number of connected IoT devices reached 14.4 billion globally, according to IoT Analytics [2]. The design and quality of an IoT system may be improved through a testbed facility, where the system is tested, tuned, and calibrated prior to deployment in the real world [3]. This paper provides a comprehensive review of such testbed facilities.
The IoT paradigm intelligently connects actuators, sensors, and the environment in order to improve the efficiency and real-time capabilities of certain processes [4]. The IoT is currently involved in many modern applications, such as Intelligent Transportation Systems, Building Management Systems and smart homes, agriculture, industrial automation, healthcare, consumer services, and many other fields [5]. Although the IoT can be utilized in a huge variety of applications, some deployments are tightly tailored to a very specific application, for example, indoor localization [6] or body coupled communication [7], without any room for design flexibility.
Due to the significance of sensor data collection and exchange in IoT applications, the core element and key enabler of IoT applications is the Wireless Sensor Network (WSN). A WSN usually consists of spatially distributed low-power sensor nodes, backed by an architecture that mainly depends on efficient communication protocols addressing energy efficiency, scalability, satisfactory Quality of Service, and improved data transmission [8]. To allow a broad spectrum of applications to be deployed while avoiding design rigidity, WSN testbeds are the optimal solution for prototyping IoT applications. In most cases, WSN testbeds represent real application conditions better than simulation tools [4]. To attain the main goal of performing diverse experiments, a WSN testbed design needs to balance the controllability of laboratory environments against closeness to real-world conditions. Accordingly, the specification of a WSN testbed introduces a set of challenges during its design, including architectural challenges such as scalability or the possibility of upgrading to new hardware, hardware challenges, and software challenges such as the design of the testbed front-end and back-end [9].
The term “testbed” covers a variety of uses in the general literature, ranging from a single-unit testbench to full-scale, 24/7 available testing facilities with advanced features such as power profiling. The device used for experiments is called the device under test (DUT) and is typically separate from the testbed infrastructure. In this article, we use the term “testbed facility” and have set the following criteria as a minimum viable product for a testbed facility:
  • Capable of running a variety of different experiments, where the DUTs are completely controlled by the user;
  • Any interaction with the hardware and software can be performed remotely;
  • Has a user interface specifically designed for testbed facility purposes. Having only Secure Shell Protocol (SSH) or Virtual Private Network (VPN) and similar solutions for access to testbed devices is insufficient.
This systematic review was designed to understand how different aspects and functionalities impact the usage of a testbed facility and which gaps are not yet covered by the existing testbed facilities, motivated by the fact that we develop and maintain our own testbed facility. The research question of this article is the following: How do different metrics impact the usage of testbed facilities? We chose to investigate the following metrics: DUT type, provided sensors, access level, user interface, assistive tools, architecture, cost of implementation, open source, facility count, DUT connection interfaces, DUT interaction interfaces, DUT locations, DUT count, availability, geographic locations, and power monitoring.
The ultimate goal of this review article is to provide the reader with a solid, comprehensive, application-agnostic summary of the testbed facilities published during the last decade, up to the year 2021. For the document to serve as a guide for end-users and developers of WSN testbed facilities, the landscape of the available facilities had to be surveyed and deeply analyzed. To the best of the authors’ knowledge, most of the available survey and review articles do not include analysis and developer or author insights into the reviewed testbed facilities, which is the main target of this document.
The rest of the article is structured as follows: Section 2 describes surveys in the domain of testbeds; Section 3 briefly describes the methodology used to gather, codify, and analyze the dataset, with a broader description provided in the published dataset article by Judvaitis et al. [4]; Section 4 presents the analyzed data concerning the research question, together with the authors’ inferences; Section 5 discusses existing challenges and possible future work in the testbed facility domain; and the conclusions are presented in Section 6.

2. Related Work

A technical survey of wireless sensor network platforms was published in 2008 by Omiyi et al. [10]. A large section of the article is devoted to the specifications of the sensor nodes available at that time, such as microcontrollers, radio transceivers, sensors, and cost. Six sensor nodes were compared according to their specifications; examples of these nodes were the MSP430, TmoteSky, and EyesIFXv2. Seven testbeds were discussed in the subsequent section; five of them were located in the US and two in Europe. The testbeds were outlined in terms of their scope, size, key functions, architecture, and wireless technology used.
In 2010, a survey on simulators and testbeds for Wireless Sensor Networks was published by Imran et al. [11]. The paper briefly discusses around 20 performance evaluation tools for Wireless Sensor Networks, including simulators, emulators, and testbeds; afterwards, seven testbeds are briefly mentioned and described.
In 2011, the survey published by Steyn and Hancke [12] briefly described and classified 23 testbeds according to their features. Examples of the classification features chosen by the authors were server-based control, single-PC-based control, multisite testbeds, and in-band management traffic testbeds.
The Multimedia variant of Wireless Sensor Network (WMSN) facilities was defined and briefly discussed by Farooq and Kunz [13]. Six WMSN facilities were discussed, one paragraph per facility, along with five standard Wireless Sensor Network facilities, which were reported in more detail and compared according to the number of available DUTs, software/hardware heterogeneity, and deployment scale.
Horneber and Hergenröder [14] surveyed testbeds and experimentation environments for Wireless Sensor Networks, with the main purpose of discussing the design decisions taken during the process of testbed development. Forty Sensor Network testbeds, three Sensor Network testbed federations, six Wireless Networking testbeds, and seven testbed-related tools were listed and used as examples for the discussion of design decisions. The second part of the article evaluates some of the mentioned facilities in simulated/emulated environments with respect to Wireless Sensor Network research topics. Specific evaluation aspects were energy efficiency, sensor node mobility, localization infrastructure, target environments, and scalability.
Another survey, on static and mobile wireless sensor network experimentation testbeds, was published by Tonneau et al. [15] in 2014. The authors surveyed ten stationary testbeds and five mobile testbeds based on the categories of experimentation, hardware features, maintenance, and mobility. In 2015, the same group of authors, Tonneau et al. [16], extended the preceding survey by investigating the features provided by testbeds in more detail, in order to assist users in choosing a testbed according to their experiment requirements and conditions.
The survey published by Ma et al. [17] briefly describes a number of challenges facing experimental research using Wireless Sensor Network testbeds; subsequently, the authors enumerate recent testbed efforts designed to address these challenges. For instance, some of the challenges mentioned by the authors are high-precision observation, reproducible environmental factors, and federation testing. The authors presented and evaluated eight testbeds in total, two per challenge.
In 2018, Zhou et al. [18] reviewed the testing of Cyber-Physical Systems (CPS) and investigated several methods and testbeds. The authors elaborated on CPS testing methods and reported the taxonomy of Method-based CPS testing and online monitor tools for CPS in tables. The testing methods were categorized by their paradigm and underlying techniques. Various types of CPS testing were discussed in detail such as model-based testing, search-based testing, fault injection for CPS testing, big data-driven testing, and conformance testing along with other types of tests. Furthermore, the authors carried out architecture and function analysis, by reporting the taxonomy for several existing CPS testbeds, communication infrastructure, and simulation tools. The article also provides some challenges for future complex CPS testing methods, such as uncertainty modeling, state space explosion, and real-time CPS testing assurance.
In a systematic review published in 2022 by Judvaitis et al. [19], based on a dataset by Judvaitis et al. [20], the authors discussed the usage of testbeds in sensor network deployments. Only a small percentage (17.75%) of deployments used testbeds in the observed five-year period from 2013 to 2017, suggesting there is room for improvement and increased usage. The authors noted that there is no noticeable trend in the usage of testbeds over the years, but the data show that testbeds are typically used for medium-scale deployments, when targeting Technology Readiness Level 5 (TRL5), or when researching the sensor network itself. Wireless sensor networks are more likely to be tested in a testbed than wired or hybrid networks. The usage of testbeds is higher in industry-relevant fields such as communication, industry, infrastructure, and transport, indicating that there is a niche for testbeds to occupy in helping with sensor network deployments in real-world situations where complexity is greater. Additionally, interactive deployments tend to use testbeds more than passive deployments, suggesting that testbeds may be used as a means to curb complexity. The article also suggests that there is a need for additional research on mobile testbeds, which are currently underutilized, and on testbeds for specific target environments such as urban, outdoor, or underwater settings.
To the best of the authors’ knowledge, there are no existing systematic reviews of the available testbed facilities. General-purpose testbed facility review articles are a scarce resource; instead, review articles usually focus on specific domains, such as healthcare [21] and agriculture [22].

3. Methodology

In order to perform a comprehensive review of the available testbed facilities, we decided to include the testbed facilities introduced and published in the scientific literature between the years 2011 and 2021. Our review consists of two primary stages, namely, data acquisition and data analysis. The data set used is publicly available and is described by Judvaitis et al. [4]. In the following section, we provide a brief overview of the data acquisition and analysis.

3.1. Data Acquisition

Data acquisition is the extraction of metadata from scientific articles that potentially describe a testbed facility. Our inclusion criteria for the systematic review were publications indexed by the SCOPUS or Web of Science (WoS) citation databases, written in English, and authored by the creators/developers of the testbed facilities. The query syntax used for the SCOPUS and WoS databases was:
  • SCOPUS: TITLE ( testbed ) AND TITLE-ABS-KEY ( wsn OR iot OR “sensor network*” OR “internet of thing*” ) AND SUBJAREA ( comp ) AND PUBYEAR < 2021 AND PUBYEAR > 2010
  • WoS: TI = (“testbed”) AND (AB = (wsn OR iot OR “sensor network*” OR “internet of thing*”) or AK= (wsn OR iot OR “sensor network*” OR “internet of thing*”)) and SU=“Computer Science” and py =(2011–2020)
The executed queries on both databases returned 522 articles; thereafter, 163 duplicate articles were removed, qualifying 359 articles for the screening step.
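For transparency, the merging and deduplication of database exports can be scripted. The following is a minimal illustrative sketch, assuming CSV exports with “DOI” and “Title” columns; the file names and column names are our assumptions and do not describe the actual tooling used for the published dataset [4].

```python
# Illustrative sketch: merging SCOPUS and WoS exports and removing duplicates.
# File names and column names ("DOI", "Title") are hypothetical assumptions.
import pandas as pd

scopus = pd.read_csv("scopus_export.csv")  # hypothetical export file
wos = pd.read_csv("wos_export.csv")        # hypothetical export file

merged = pd.concat([scopus, wos], ignore_index=True)

# Normalize match keys so the same article matches across both databases.
merged["doi_key"] = merged["DOI"].str.lower().str.strip()
merged["title_key"] = (
    merged["Title"].str.lower().str.replace(r"\W+", " ", regex=True).str.strip()
)

# Deduplicate by DOI where present, falling back to normalized titles otherwise.
with_doi = merged[merged["doi_key"].notna()].drop_duplicates(subset="doi_key")
without_doi = merged[merged["doi_key"].isna()]
deduplicated = pd.concat([with_doi, without_doi]).drop_duplicates(subset="title_key")

print(len(merged), "retrieved;", len(deduplicated), "after removing duplicates")
```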
In the screening phase, where only the title and abstract were considered, 170 articles were discarded as not describing a testbed facility, leaving 189 articles to be investigated further. The steps of our systematic review are depicted as a flow diagram in Figure 1.
To determine the geographic location and availability of the testbeds, a systematic approach was undertaken. Depending on the type and specifications of the experiment to be held, the environment in which the testbed facility is established can play a vital role; examples include variations in temperature, pressure, altitude, and air quality from one geographic location to another. Accordingly, we decided to review and include the geographic location as a facility feature. There is a wide spectrum in how geographic information is stated for each facility, ranging from clearly stated to not mentioned at all. To identify the geographic location and availability of testbed facilities, we conducted a thorough search using the following step-wise approach:
  • Reviewing the homepage or git repository of the testbed facility, as this was the most reliable source of information, if available;
  • A Google search using the name of the testbed facility;
  • Lookup of the laboratory or group responsible for maintaining the testbed facility;
  • Comparing satellite images of the facility location with Google Maps satellite data of the institution’s area to obtain an estimate;
  • Examining the funding project homepage;
  • Using the address listed in the affiliation of the first author.
Through systematic analysis of these various sources of information, using the most accurate position obtained, we gathered information regarding the geographic location and availability of the testbed facilities. If no information regarding the availability of a testbed facility was found through the above-mentioned steps, it was marked as ‘not available’ (NA).

3.2. Data Analysis

To perform the systematic review of the testbed facilities obtained from the screening phase, we first needed to ensure the eligibility of the described solution as a testbed facility (Phase 2); then, each testbed facility needed to be characterized in terms of numbers, functionalities, architecture, and other features (Phase 3).
The second phase is a complete text screening phase, in contrast to the first phase, which only considered the titles and abstracts of the articles. The outcome of the second phase was the exclusion of 125 of the 189 articles.
The third phase of the analysis comprises the characterization of testbed facilities according to a set of predefined values, determined by our criteria for classification as a testbed facility. We decided to segment the characterization of testbed facilities into three levels: Facility level, DUT level, and Supplementary level. During this stage, 19 more articles were excluded as not describing a testbed facility.
The facility-level description includes a generic description of the facility’s geographic location, architecture, and workstations specifications, implementation cost, deployment options (support of outdoors deployment), facility count (either multiple or federated facilities), functionalities (features provided to the user), and access level (level of user control). The DUT level of characterization includes the number of DUTs, location accuracy (precision of DUTs’ locations), mobility (physical motion during experiments), DUT connection, and interactive interfaces. The Supplementary level includes assistive tools (any tools designed to improve the user experience), source code availability (e.g., open-source platform), and user interface (UI) type.
The three characterization segments of the testbed facilities were extracted and compiled by our team members for each article separately, in a unified machine-readable form using JSONForms (https://jsonforms.io/ (accessed on 25 May 2023)) for further investigation.
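To illustrate the three-level characterization, a single hypothetical record could look roughly as follows. The field names and values below are invented for illustration; the exact structure of the published dataset is described in [4].

```python
# Hypothetical example of a three-level characterization record.
# All field names and values are illustrative, not the actual dataset schema.
import json

record = {
    "facility": {                          # Facility level
        "name": "ExampleBed",
        "geographic_location": "Riga, Latvia",
        "architecture": "star",
        "implementation_cost_eur": None,   # rarely reported; see Section 4.5
        "facility_count": "single",
        "access_level": ["UI", "API"],
    },
    "dut": {                               # DUT level
        "count": 50,
        "types": {"LPED": 40, "SBC": 10},
        "location_accuracy": "room",
        "mobility": False,
        "connection_interfaces": ["USB"],
        "interaction_interfaces": ["UART-to-USB"],
    },
    "supplementary": {                     # Supplementary level
        "assistive_tools": ["tutorial"],
        "open_source": True,
        "user_interface": ["web"],
    },
}
print(json.dumps(record, indent=2))
```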

4. Results and Discussion

Table 1 lists the reviewed testbed facilities and includes the following information: the name of the testbed facility, references to the main and any additional articles, and the properties of the testbed facility, such as the DUT count, whether the testbed facility is active, and the main features of the testbed facility. Multiple observations can be made from Table 1: (i) four testbed facilities have an NA DUT count because we were not able to identify the available DUT count from the articles; (ii) a testbed facility is more likely to still be active if the provided DUT count is comparatively higher; (iii) 69% of the testbed facilities have only a single article about them, while among the testbed facilities with more than one article, there are seven articles with updates, five with demonstrations, and one abstract. The testbed facility with the most articles is the EDI TestBed by Ruskuls et al. [23], which has a main article and four update-related articles.
Figure 2 summarizes the main article publication years for the testbed facilities. There are no major trends in the evaluated 10-year period, except for a peak around 2015 and 2016 and a following dip in 2017 and 2018; after that, for two consecutive years, the publication of articles about new testbed facilities continued at a steady pace.
The overview of the available DUTs and sensors is shown in Table 2. The following assumptions should be kept in mind while reading the table:
  • Any DUT that cannot be commercially purchased or whose type is not stated in the article is considered Custom;
  • By Low-performance embedded devices (LPED), we understand MSP430-based and similar devices mainly intended for battery-powered deployments;
  • By High-performance embedded devices (HPED), we understand ARM A8, M3, M4, and similar DUTs with considerable computational power;
  • By Mobile devices (MD), we understand DUTs that are located on a mobile platform or are a mobile platform themselves;
  • By Single-board computers (SBC), we understand devices such as the Raspberry Pi and similar, as well as tablets and smartphones;
  • For sensors and actuators, we use the following categories:
    IMU (inertial measurement unit): accelerometers, gyroscopes, and magnetometers.
    Acoustic: acoustic and noise sensors.
    Air quality: CO, CO2, dust, gas, smoke, etc.
    Presence: presence or proximity of an object or event, for example, door and window state, fire detection, car presence, etc.
    Location: GPS, line followers, hall encoders, etc.
    Environment: temperature, humidity, light, weather station, etc.
    Energy: energy consumption measurements of any third device.
    Actuators: interaction with the real world in any perceived way.
  • NA means that there is no information about the count of the devices or sensors, but it is mentioned that the testbed facility contains them.
An example of how to interpret the data from Table 2 for FIT IoT-LAB by Adjih et al. [63] is provided:
  • Devices Under Test:
    A total of 96 Custom, referred to in the article as generic host nodes.
    A total of 1144 LPED, referred to in the article as WSN430 open nodes.
    A total of 1488 HPED, referred to in the article as 938 M3 open nodes and 550 A8 open nodes.
    A total of 117 MD, referred to in the article as 85 Turtlebots and 32 Wifibots.
  • Sensors and actuators:
    A total of 1488 IMU sensors, referred to in the article as:
      * 938 three-axis units (gyro, accelerometer, magnetometer) attached to M3 open nodes, one per node;
      * 550 three-axis units (gyro, accelerometer, magnetometer) attached to A8 open nodes, one per node.
    A total of 1023 presence sensors, referred to in the article as:
      * 938 pressure sensors attached to M3 open nodes, one per node;
      * 85 Microsoft Kinect sensors attached to Turtlebots, one per node.
    A total of 381 location sensors, referred to in the article as:
      * 232 GPS sensors attached to M3 open nodes (not all nodes have one);
      * 85 odometers attached to Turtlebots, one per node;
      * 32 cameras attached to Wifibots, one per node;
      * 32 hall encoders attached to Wifibots, one per node.
    A total of 4164 environment sensors, referred to in the article as:
      * 1144 temperature sensors attached to WSN430 open nodes, one per node;
      * 938 temperature sensors attached to M3 open nodes, one per node;
      * 1144 light sensors attached to WSN430 open nodes, one per node;
      * 938 light sensors attached to M3 open nodes, one per node.
The following subsections provide a detailed analysis of the testbed facility features. The following metrics contain only one value per testbed facility: Access level, Architecture, Cost of implementation, Open Source, Facility count, and DUT count. The remaining metrics can contain more than one value per testbed facility, depending on the context (for example, multiple deployments), leading to totals that are larger than the total number of testbed facilities observed in this systematic review.

4.1. Access Level

For the 32 testbed facilities, we extracted 39 data points about the access level, as a testbed facility may provide various access levels, which we understand as the type of access or interface provided to the testbed facility users. The following types of access were found: User interface (UI), Application programming interface (API), Secure shell (SSH) access, and Virtual private network (VPN).
Mostly, testbed facilities tend to provide access through a dedicated user interface (27) specifically developed for this purpose, which is expected. A considerable number of testbed facilities (8) also provide API-based access, which allows automated and scripted interactions, promoting repeatability and providing an easier way of running numerous experiments for the scientific soundness of the research.
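To illustrate why API-based access promotes repeatability, the sketch below scripts a batch of identical experiment runs against a generic testbed REST API. The base URL, endpoints, payload fields, and authorization scheme are invented for illustration and do not correspond to any particular facility.

```python
# Hypothetical sketch: scripting repeated experiments against a testbed REST API.
# All endpoints, parameters, and the token are illustrative assumptions.
import time

import requests

BASE = "https://testbed.example.org/api"       # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}  # hypothetical auth scheme

for run in range(10):  # ten identical runs for statistical soundness
    with open("firmware.bin", "rb") as firmware:
        resp = requests.post(
            f"{BASE}/experiments",
            headers=HEADERS,
            files={"firmware": firmware},
            data={"duration_s": 600, "nodes": "1-20", "label": f"run-{run}"},
        )
    resp.raise_for_status()
    exp_id = resp.json()["id"]

    # Poll until the run finishes, then fetch the serial logs for offline analysis.
    while requests.get(f"{BASE}/experiments/{exp_id}", headers=HEADERS).json()["state"] != "done":
        time.sleep(30)
    logs = requests.get(f"{BASE}/experiments/{exp_id}/logs", headers=HEADERS)
    with open(f"run-{run}.log", "wb") as out:
        out.write(logs.content)
```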
The only testbed facility where a VPN connection is used in conjunction with a graphical user interface is the Integrated Testbed for Cooperative Perception by Jiménez-González et al. [41], a testbed facility for cooperative experiments involving mobile robots and WSNs. The VPN connection is used to secure communications and prevent potential uncontrolled and malicious remote access; this additional layer of security is introduced because testbed facility users can control several robot motion control functionalities, such as low-level velocity control, local position control, and trajectory following.
The SSH access is used by four testbed facilities, and only one of them relies solely on an SSH connection.
FIT IoT-LAB by Adjih et al. [63] has a dedicated UI but, as an added benefit, provides SSH access to the most powerful sensor node in the testbed facility, equipped with a Cortex-A8 processor capable of running Linux or Android; this direct connection enables wider control over the hardware and is used for hardware-specific purposes rather than general interaction with the testbed facility. ASNTbed by Dludla et al. [36] uses an SSH connection as a temporary access model while the web-based UI is being developed, as providing SSH access is easy and fast. SensLab by Burin des Rosiers et al. [61] is the only testbed facility to use SSH as the primary and only planned access method; it provides access to a remote virtual machine preconfigured with all the tools needed for the user to interact with the testbed facility. Generally, SSH access is not widespread among testbed facilities, as it poses too many security risks, yet it remains tempting because of its ability to provide complete control over the DUT with minimal effort.

4.2. User Interface

For the 32 testbed facilities, we extracted 41 data points about the provided user interface. The most popular form of user interaction, provided by 16 testbed facilities, is a graphical user interface (GUI), followed by a web interface (12). This suggests that these testbed facilities are geared toward non-expert users who prefer easy-to-use and understandable interfaces. A command line interface (CLI) is supported by 11 testbed facilities; this is a tradeoff between friendliness to non-experts and ease of use in professional use cases, where interaction with the testbed facility must be seamless and easy to integrate into an existing workflow and/or automation.
The only testbed facility to provide a mobile app as a user interface is the smart city testbed facility SmartSantander by Sanchez et al. [65]; the app provides users with the ability to subscribe to data streams and alerts relevant to their proximity. SmartSantander also provides a Participatory Sensing mobile app that enables users to become part of the testbed facility by providing sensed physical measurements along with any observations transmitted as text, images, or video. Along with the mobile applications, the SmartSantander testbed facility also provides traditional user interaction possibilities by means of a CLI or a web-based interface.
An unusual outlier is a testbed facility using a RESTful API as the user interface: the architecture proposed in the Fully reconfigurable plug-and-play wireless sensor network testbed by Bekan et al. [38]. The RESTful API is used in a proof-of-concept reference implementation, with the idea that the user interface approach will be improved when the proposed architecture is implemented in a fully-fledged testbed facility.

4.3. Assistive Tools

For the 32 testbed facilities, we extracted 32 data points about provided assistive tools, but this does not imply a one-to-one correspondence: only 18 testbed facilities have any sort of assistive tools. The fact that only a small portion of the testbed facilities provide a tutorial (six) or a user manual (four) means that the rest are not intent on actively attracting new users or alleviating the learning curve. There are a total of eight distinct testbed facilities with some supportive material available, three of which provide both a tutorial and a manual.
To facilitate deeper integration between user code and the provided capabilities, five testbed facilities provide integration libraries. FIT IoT-LAB by Adjih et al. [63] provides wireless communication libraries with simple and useful APIs for MAC protocol implementations.
Only two testbed facilities provide drivers as assistive tools to the users. FIT IoT-LAB by Adjih et al. [63] has developed and maintains OS-independent drivers that give access, through an API, to all hardware modules on the DUT. This is particularly useful for testing an envisioned system on a set of heterogeneous devices because, in theory, users do not need to worry about device-specific implementations of their code, providing a faster implementation path with the trade-off of code portability, as the code will only work on hardware supported by the FIT IoT-LAB team. SensLab by Burin des Rosiers et al. [61] also mentions the existence of drivers ready to be used with the SensLAB testbed facility; unfortunately, no additional details about this integration are given in the articles.

4.4. Architecture

For almost half of the testbed facilities (15), the articles do not mention what kind of backend architecture is used for interconnection. The most popular architecture, mentioned in 15 testbed facility articles, is the star topology, where one central server has multiple workstations connected to it directly or through aggregators.
The only testbed facility to use mesh as a backend networking topology is SmartSantander by Sanchez et al. [65], where the sensor nodes are provisioned with two separate communication channels: one is used for experimentation by the users, and the other provides the data channel for management and service functionality. The management and service communication channel uses both single-hop and multi-hop links to transfer data to the gateway and server via a Digimesh-enabled (https://www.digi.com/products/browse/digimesh (accessed on 25 May 2023)) radio interface.
The only testbed facility to use a tree topology for backend communications is the flexible and low-cost testbed facility FIST by Guo et al. [26]; it relies solely on the wireless connections provided by the TelosB sensor nodes and manages to use the same radio channel for the control of the sensor nodes as well as for the needs of the experiment itself. This is achieved by dividing the software into two separate parts, the Testbed Program Space and the User Experimental Program Space, both of which coexist on the TelosB sensor nodes.

4.5. Cost of Implementation and Open Source

Generally, articles about testbed facilities do not provide any details about the cost of implementation; only two facilities have provided the cost per workstation and DUT, respectively: EUR 158 for Indriya by Doddavenkatappa et al. [53] and EUR 476 for OpenTestBed by Munoz et al. [45]. OpenTestBed has also provided the cost for the whole implementation of the testbed facility, which amounts to EUR 9480. Overall, the data about the cost of implementation are insufficient to provide any meaningful insight into the cost of developing a testbed facility.
Generally speaking, the cost of implementation alone provides little value to someone who would like to build their own testbed facility; only together with documentation and, ideally, open-source implementation details for software and hardware would it allow interested developers/users to reproduce the testbed facility. Of the aforementioned testbed facilities, only the Indriya implementation is fully open-source, while OpenTestBed has published only its hardware implementation. Altogether, only nine testbed facilities have provided their implementation as open-source, with the previously mentioned OpenTestBed additionally publishing its hardware implementation. Twenty-three of the testbed facilities do not provide any implementation details, making it impossible to reproduce their work. Addressing this could benefit the testbed facility ecosystem, allowing for a faster and more diverse expansion of testbed facilities based on previous knowledge and resources.

4.6. Facility Count

Almost all of the testbed facilities are located in a single place, except for SensLab by Burin des Rosiers et al. [61], which is composed of four wireless sensor networks distributed across France and interconnected via the internet, with a total of 1024 sensor nodes.

4.7. DUT Connection Interfaces

Only three testbed facilities provide a JTAG connection interface to the DUT; this indicates that most testbed facilities are not focused on debugging embedded software, as JTAG is the most popular and powerful connection used for such activities. On the other hand, the most typical DUT connection interface is USB, which is found in 22 testbed facilities.
PhyNetLab by Venkatapathy et al. [59] provides a custom eight-pin connection interface that integrates the board with the management platform, which is connected to the testbed facility backend using a ZigBee connection. This approach was chosen to make the implementation effective and to allow precise current measurements and support for energy harvesting capabilities, but it limits the usage of other DUTs because they must conform to the custom eight-pin interface.
EASITEST by Zhao et al. [37] uses Ethernet or Wi-Fi for the DUT connection, as the hardware used is capable of providing this connection type for management purposes. This is quite unusual: typically, testbed facilities use IP-stack-capable devices as gateways for the less powerful devices used as DUTs, but in this testbed facility, two different types of devices are used as DUTs: a more powerful one with full Linux support and an MSP430-based device with an attached Wi-Fi module.
Open CLORO by Portaluri et al. [25] uses a custom-made I2C communication over the RJ2 type connection to control a LEGO Mindstorms Platform. In this case, the unusual connection type is due to the necessity of the chosen DUT, which has limited connection capabilities.
Two distinct outliers that use the low-rate wireless personal area network standard IEEE 802.15.4 as the DUT connection interface are the flexible and low-cost testbed facility FIST by Guo et al. [26] and the Fully reconfigurable plug-and-play wireless sensor network testbed by Bekan et al. [38]. Both provide a wireless management channel, although each uses a somewhat different approach: the first uses two MAC stacks on top of the same physical layer, and the second uses two radio modules on the same sensor node. Both solutions claim that the approach has no effect on the user space of the wireless communication used in the experiments.
The only testbed facility to use Power over Ethernet (PoE) for the DUT connection interface is the USN testbed by Mir et al. [55]. The necessity for PoE connection severely limits the possible kinds of DUT in the sensor network domain, while providing a more capable and direct communication channel.
Another exotic DUT connection interface is the RF link used in the Ocean-TUNE UCONN testbed by Peng et al. [33] because this testbed facility is located below buoys underwater and thus cannot feasibly provide any wired connection. The testbed facility is aimed toward acoustic communication channel testing.

4.8. DUT Interaction Interfaces

For the 32 testbed facilities, we extracted 17 data points about provided DUT interaction interfaces. Only 3 testbed facilities provide GPIO interaction capabilities for the DUT, while the majority (13) use the standardized UART-to-USB interface for any interactions, mainly using bidirectional serial communication. The issue with providing more than the standard USB interaction interface is the need to manually connect the interface for each sensor node, and possibly for each experiment, depending on the interface type.
One noteworthy distinction is the EDI TestBed by Ruskuls et al. [23], which provides a controllable eight-channel DAC and ADC to interact with the DUT; this is implemented in the workstation hardware and allows the user to use the DAC to send analog signals to, and the ADC to read signals from, the DUT. This feature allows for sensor data infusion into the embedded software, for external sensor simulation, or for the extraction of raw voltage data for verification purposes.
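To make the sensor data infusion idea concrete, the sketch below streams a synthetic analog sensor waveform to one output channel and reads back a DUT pin for verification. The client class and its methods are entirely hypothetical; the EDI TestBed articles do not specify a programming interface for this feature.

```python
# Hypothetical sketch of analog sensor data infusion (client API is invented).
import math

class AnalogChannelClient:
    """Placeholder for a workstation analog I/O interface; not a real EDI TestBed API."""

    def write_voltage(self, channel: int, volts: float) -> None:
        # A real client would drive the workstation DAC channel here.
        print(f"DAC ch{channel} <- {volts:.3f} V")

    def read_voltage(self, channel: int) -> float:
        # A real client would sample the workstation ADC channel here.
        return 1.65  # dummy mid-scale reading

client = AnalogChannelClient()

# Emulate a slowly varying analog sensor (e.g., a thermistor divider output)
# so the embedded software under test sees realistic input without a physical sensor.
for step in range(100):
    volts = 1.65 + 0.5 * math.sin(2 * math.pi * step / 100)
    client.write_voltage(channel=0, volts=volts)

reading = client.read_voltage(channel=1)  # raw voltage from the DUT for verification
print("DUT output:", reading, "V")
```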

4.9. DUT Location

For the 32 testbed facilities, we extracted information about 60 DUT deployment locations; a single testbed facility can have multiple locations, all of which are counted. Most of the DUT deployments (45) are located in an office environment. This directly corresponds to the laboratory environment where most testing takes place and usually corresponds to TRL5. The second most popular deployment location, with 14 deployments, is outdoors in a city, which can help achieve TRL6 for a developed sensor network system intended for outdoor use. A noteworthy exception is PhyNetLab by Venkatapathy et al. [59], which is located in a materials handling research facility with sensor nodes attached to smart shipping containers; it thus provides an ideal place to emulate the sorting and picking applications typical of material handling facilities. The extracted location accuracy among the testbed facilities, ranked from high to low with the corresponding testbed facility counts, is: Precise (five), Room (nine), Building (nine), Premises (five), and Not specified (four). The location accuracy, i.e., how precisely the user knows where the tested device is located, forms a natural distribution, with four testbed facility articles not providing any information about this metric.

4.10. DUT Count

Several testbed facility articles do not mention the number of available DUTs: City of Things by Latre et al. [27], FIST by Guo et al. [26], Dandelion by Wang et al. [24], and Open CLORO by Portaluri et al. [25]. There are also several testbed facilities that provide a concept implementation with a minimal DUT count, aiming only to evaluate the proposed approach rather than to continue providing the services of a testbed facility.

4.11. Availability and Geographic Locations

Only 9 of the 32 testbed facilities were available online. In our analysis, we discovered that a significant number of facilities did not provide sufficient information regarding their geographic location or accessibility: we were unable to determine the location of 13 of the testbed facilities analyzed. For the remaining 19 facilities, we were able to determine and verify their exact locations, as shown in Figure 3.

4.12. Power Monitoring

Since sensor nodes are often battery-powered, energy efficiency is one of the most crucial aspects of sensor network development. We identified only six testbed facilities capable of providing functionality to evaluate this aspect: TWECIS by Pötsch et al. [30], FlockLab by Lim et al. [46], EDI TestBed by Ruskuls et al. [23], RT Lab by Pradeep et al. [52], SensLab by Burin des Rosiers et al. [61], and FIT IoT-LAB by Adjih et al. [63]. One of the fundamental functions is power monitoring, which determines how much energy the device consumes while performing different activities. All of the aforementioned testbed facilities provide this functionality, although the parameters vary and are sometimes omitted from the articles; a summary is presented in Table 3. There are other features that can be utilized to improve energy efficiency, such as the power profiling functionality provided by FlockLab by Lim et al. [46] and the adjustable power supply functionality provided by the EDI TestBed by Ruskuls et al. [23] and FlockLab by Lim et al. [46]. The adjustable power supply allows users to simulate battery discharge and observe the functionality of DUTs under a decreasing supply voltage.
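As a minimal illustration of what power monitoring enables, the energy consumed over an activity window can be estimated from sampled current draw as E = V · Σ iₖ · Δt. The supply voltage, sampling rate, and current trace below are invented for demonstration.

```python
# Minimal sketch: estimating consumed energy from a sampled current trace.
# Supply voltage, sampling interval, and samples are illustrative values.
SUPPLY_V = 3.0     # volts
SAMPLE_DT = 0.001  # seconds between current samples (1 kHz monitoring)

# Hypothetical current samples in amperes: sleep, a radio transmission burst, sleep.
current_a = [0.000005] * 500 + [0.020] * 100 + [0.000005] * 400

charge_c = sum(i * SAMPLE_DT for i in current_a)  # coulombs
energy_j = SUPPLY_V * charge_c                    # joules

print(f"Average current: {charge_c / (len(current_a) * SAMPLE_DT) * 1000:.3f} mA")
print(f"Energy over window: {energy_j * 1000:.3f} mJ")
```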

5. Existing Challenges and Future Recommendations

After the systematic review of 359 testbed-related articles and 32 relevant testbed facilities, a few challenges stand out that the authors deem worth discussing and considering for the future ecosystem of WSN testbed facilities.

5.1. Reproducibility, Documentation, and Open Source

One issue that the review outlined is the limited reproducibility of the published testbed facilities, not just for verification, but also for the reuse of existing work by research groups willing to replicate or expand a testbed facility. Making a testbed facility open-source and providing cost estimates would enable other researchers to consider and implement their own testbed facilities based on validated designs. As a step further, these testbed facilities could be interconnected, providing a scaled-up testbed facility thanks to combined resources. Our search found only one testbed facility, SensLab by Burin des Rosiers et al. [61], that was distributed across France in four locations rather than located in one place.
Among the reviewed testbed facilities, only nine were open-source. It was also noted that supporting documentation, such as user manuals and tutorials, was rather scarce, provided by only eight testbed facilities. These aspects must be addressed in future testbed facility designs; otherwise, the endeavor risks living a short and solitary life with limited contribution to and from the community.

5.2. Ground Truth for Measurements, Communication, and Location

One of the challenges when developing WSNs is the issue of ground truth data, particularly in the context of measurements, communication, and location. Ground truth data refers to the accurate and precise measurement or determination of a particular parameter or variable in a given system or environment, which serves as the reference or benchmark for other measurements or evaluations. Ideally, ground truth is obtained by an irrefutable measurement of the parameter. While some testbed facilities provide ground truth measurements in some form, many do not. Perhaps in the future, the testbed facilities could address the necessity for ground truth data.
Firstly, the sensors used in WSNs may have varying levels of accuracy and precision, depending on such factors as cost, design, and calibration. This can lead to significant variations in the measurements obtained by different sensors, making it difficult to establish a reliable ground truth. A testbed facility providing a side channel with verified measurements of the environment could facilitate research questions related to sensor calibration, effective measurement rate, and other aspects of obtaining reliable measurements in a constrained environment.
Secondly, the communication between sensor nodes in a network can be affected by a range of factors such as interference, noise, and signal attenuation. This can result in data loss, corruption, or delays, further complicating the establishment of ground truth. A testbed facility could provide statistics about real communication attempts and characteristics of the communication environment, or perhaps even record and replay the communication environment noise and other factors during the tests.
Thirdly, the location of sensors, especially in mobile sensor networks, can also pose challenges to ground truth. Localization techniques and timing can introduce errors that lead to inaccurate or incomplete information about the location of sensors and the data they collect. A testbed facility should help monitor and evaluate the application-estimated locations with respect to the ground truth, i.e., the actual location of the nodes with precise timing.
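For the location case, the evaluation reduces to comparing time-aligned estimated positions with ground-truth positions. The sketch below computes the mean and root-mean-square localization error for a mobile node; the coordinates are made up for illustration.

```python
# Minimal sketch: localization error against ground truth (coordinates are made up).
import math

# (timestamp_s, x_m, y_m) triples: testbed-recorded ground truth vs. application estimates.
ground_truth = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 2.0, 1.0)]
estimates    = [(0.0, 0.2, 0.1), (1.0, 0.9, 0.3), (2.0, 2.4, 0.8)]

# Euclidean error per time-aligned sample pair.
errors = [
    math.hypot(gx - ex, gy - ey)
    for (_, gx, gy), (_, ex, ey) in zip(ground_truth, estimates)
]
mean_err = sum(errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"Mean error: {mean_err:.2f} m, RMSE: {rmse:.2f} m")
```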

5.3. Exotic Environments

Most of the reviewed deployments by testbed facilities were indoors (48), for example, occupying an office building environment, while only 14 were outdoors. Only one stood out by providing a testing environment for underwater sensor networks: the Ocean-TUNE UCONN testbed by Peng et al. [33]. We would like to make a case for more diversity in the environments that testbed facilities provide. There are likely many researchers willing to test their work in a testbed facility deployment that is, for example, situated in an ocean, on a volcano, or perhaps even in space, but who lack the access or means to deploy their designs in such environments. Opportunities to test and validate designs in trying environments would be valuable to the community for two reasons: (1) they are closer to real-world deployments in potentially complex environments, and (2) cost and accessibility in such environments are a challenge. Therefore, it would be encouraging and exciting to see more testbed facilities providing a variety of exotic and challenging environments.

5.4. Ecosystem for Testbed Facilities

Looking back at the reviewed solutions raises a question: what would be a good ecosystem for WSN testbed facilities? An ecosystem where testbed facilities could be developed and served as needed by the research community around the world. The current outlook shows that testbed facilities are seldom used, at least as indicated by publications referencing them. Another related question is what makes a testbed facility a useful and desirable tool. Perhaps the answer should be structured with respect to hardware and software resources, research and development initiatives, and community support.
Concerning hardware and software resources, there should be a wide range of devices, sensors, and networking equipment providing a variety of communication modes and protocols. However, all of this cannot be included in a single testbed facility; therefore, a good ecosystem would provide a balanced and somewhat coordinated collection of testbed facilities, each focusing on a certain hardware or software environment aspect. Moreover, should multiple testbed facilities arise that facilitate similar or compatible hardware, the ecosystem should provide means of integrating them, thus enabling distributed testbed facilities rather than focusing on a single site.
In addition to hardware and software aspects, strong and active community support for such an ecosystem would be indispensable, promoting the development and use of existing testbed facilities as well as opportunities for networking and collaboration, access to training and education resources, and support for events and conferences that focus on testbed facilities. Such a community would also support research and development initiatives that aim to advance the state of the art in WSN testing, validation, and testbed facilities. This could include funding for research projects, collaborations with industry partners, and support for open-source software and hardware initiatives.

5.5. Testbed Facility Hub

Finally, while there have been several reviews of testbeds, there is also a centralized information hub (https://www.fed4fire.eu/testbeds (accessed on 25 May 2023)) for available testbeds in various domains, including WSN and IoT. However, of the 32 testbed facilities identified in this review, only SmartSantander by Sanchez et al. [65] is part of this hub and thus exposed to the wider community. Without being listed in hubs like this, many testbed facilities may end up supporting only one or a handful of projects and staying idle the rest of the time, while researchers and entrepreneurs could be using them to speed up their endeavors.

6. Conclusions

Validation and testing of IoT and WSN applications is a challenging field due to the increasing complexity and heterogeneity of these systems. Testbed facilities aim to aid in this task, yet they are limited by their functionality or availability, as indicated by the comprehensive overview of state-of-the-art testbed facilities in this systematic review. The review also highlighted the challenges in providing testbed facilities that are sufficiently universal for their goals and outlined possible directions for the future development of testbed facilities. Even though testbed facilities are developed and maintained by the scientific community, there is a significant lack of reproducibility, documentation, adoption of open-source approaches, provision of ground truth data, exotic and challenging deployment locations, and collaboration between testbed facilities. It was also identified that testbed facilities tend to provide custom-built devices as DUTs, as opposed to off-the-shelf devices, and that there is a low supply of mobile devices and single-board computers as DUTs. Additionally, it was unexpected to find that only 28% of the testbed facilities developed during the ten-year period are still available and that only 19% provide any functionality related to power consumption monitoring, even though this is an important aspect of wireless sensor network development.

Author Contributions

Conceptualization, J.J.; methodology, J.J. and V.A.; software, V.A.; validation, J.J., L.S. and K.O.; formal analysis, J.J.; investigation, A.E.; resources, J.J.; data curation, J.J.; writing—original draft preparation, J.J., A.E., V.A., R.B. and L.S.; writing—review and editing, K.O.; visualization, V.A.; supervision, J.J.; project administration, K.O.; funding acquisition, K.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 825196, TRINITY.

Data Availability Statement

The dataset used in this systematic review is published on the Zenodo platform as “Available Wireless Sensor Network and Internet of Things testbed facilities: dataset” [68] and is described in the data article by Judvaitis et al. [4].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Suresh, P.; Daniel, J.V.; Parthasarathy, V.; Aswathy, R. A state of the art review on the Internet of Things (IoT) history, technology and fields of deployment. In Proceedings of the 2014 International Conference on Science Engineering and Management Research (ICSEMR), Chennai, India, 27–29 November 2014; pp. 1–8. [Google Scholar]
  2. State of IoT 2022: Number of Connected IoT Devices Growing 18% to 14.4 Billion Globally. Release date: 18 May 2022. Available online: https://iot-analytics.com/number-connected-iot-devices/ (accessed on 23 January 2023).
  3. El-Darymli, K.; Ahmed, M.H. Wireless sensor network testbeds: A survey. In Wireless Sensor Networks and Energy Efficiency: Protocols, Routing and Management; IGI Global: Boulder, CO, USA, 2012; pp. 148–205. [Google Scholar]
  4. Judvaitis, J.; Abolins, V.; Elkenawy, A.; Ozols, K. Available Wireless Sensor Network and Internet of Things testbed facilities: Dataset. Open Res. Eur. 2022, 2, 127. Available online: https://open-research-europe.ec.europa.eu/articles/2-127 (accessed on 25 May 2023). [CrossRef]
  5. Abbas, Q.; Hassan, S.A.; Qureshi, H.K.; Dev, K.; Jung, H. A comprehensive survey on age of information in massive IoT networks. Comput. Commun. 2022, 7, 199–213. [Google Scholar] [CrossRef]
  6. Elkenawy, A.; Judvaitis, J. Transmission Power Influence on WSN-Based Indoor Localization Efficiency. Sensors 2022, 22, 4154. [Google Scholar] [CrossRef] [PubMed]
  7. Ormanis, J.; Medvedevs, V.; Abolins, V.; Gaigals, G.; Elsts, A. Signal Loss in Body Coupled Communication: Guide for Accurate Measurements. In Proceedings of the 2022 Workshop on Benchmarking Cyber-Physical Systems and Internet of Things CPS-IoTBench), Milan, Italy, 3–6 May 2022; pp. 22–27. [Google Scholar]
8. Jaiswal, K.; Anand, V. EOMR: An energy-efficient optimal multi-path routing protocol to improve QoS in wireless sensor network for IoT applications. Wirel. Pers. Commun. 2020, 111, 2493–2515.
9. Kim, H.; Hong, W.K.; Yoo, J.; Yoo, S.-e. Experimental research testbeds for large-scale WSNs: A survey from the architectural perspective. Int. J. Distrib. Sens. Netw. 2015, 11, 630210.
10. Omiyi, E.; Bür, K.; Yang, Y. A technical survey of wireless sensor network platforms, devices and testbeds. In A Report for the Airbus/EPSRC Active Aircraft Project EP/F004532/1: Efficient and Reliable Wireless Communication Algorithms for Active Flow Control and Skin Friction Drag Reduction; Lund University: Lund, Sweden, 2008.
11. Imran, M.; Said, A.M.; Hasbullah, H. A survey of simulators, emulators and testbeds for wireless sensor networks. In Proceedings of the 2010 International Symposium on Information Technology, Kuala Lumpur, Malaysia, 15–17 June 2010; Volume 2, pp. 897–902.
12. Steyn, L.P.; Hancke, G.P. A survey of wireless sensor network testbeds. In Proceedings of the IEEE Africon'11, Victoria Falls, Zambia, 13–15 September 2011; pp. 1–6.
13. Farooq, M.O.; Kunz, T. Wireless sensor networks testbeds and state-of-the-art multimedia sensor nodes. Appl. Math. Inf. Sci. 2014, 8, 935.
14. Horneber, J.; Hergenröder, A. A survey on testbeds and experimentation environments for wireless sensor networks. IEEE Commun. Surv. Tutor. 2014, 16, 1820–1838.
15. Tonneau, A.S.; Mitton, N.; Vandaele, J. A survey on (mobile) wireless sensor network experimentation testbeds. In Proceedings of the 2014 IEEE International Conference on Distributed Computing in Sensor Systems, Marina Del Rey, CA, USA, 26–28 May 2014; pp. 263–268.
16. Tonneau, A.S.; Mitton, N.; Vandaele, J. How to choose an experimentation platform for wireless sensor networks? A survey on static and mobile wireless sensor network experimentation facilities. Ad Hoc Netw. 2015, 30, 115–127.
17. Ma, J.; Wang, J.; Zhang, T. A survey of recent achievements for wireless sensor networks testbeds. In Proceedings of the 2017 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC), Nanjing, China, 12–14 October 2017; pp. 378–381.
18. Zhou, X.; Gou, X.; Huang, T.; Yang, S. Review on testing of cyber physical systems: Methods and testbeds. IEEE Access 2018, 6, 52179–52194.
19. Judvaitis, J.; Abolins, V.; Mednis, A.; Balass, R.; Nesenbergs, K. The Definitive Guide to Actual Sensor Network Deployments in Research Studies from 2013–2017: A Systematic Review. J. Sens. Actuator Netw. 2022, 11, 68.
20. Judvaitis, J.; Mednis, A.; Abolins, V.; Skadins, A.; Lapsa, D.; Rava, R.; Ivanovs, M.; Nesenbergs, K. Classification of actual sensor network deployments in research studies from 2013 to 2017. Data 2020, 5, 93.
21. Kashani, M.H.; Madanipour, M.; Nikravan, M.; Asghari, P.; Mahdipour, E. A systematic review of IoT in healthcare: Applications, techniques, and trends. J. Netw. Comput. Appl. 2021, 192, 103164.
22. Ojha, T.; Misra, S.; Raghuwanshi, N.S. Internet of things for agricultural applications: The state of the art. IEEE Internet Things J. 2021, 8, 10973–10997.
23. Ruskuls, R.; Lapsa, D.; Selavo, L. EDI WSN testbed: Multifunctional, 3D wireless sensor network testbed. In Proceedings of the 2015 Advances in Wireless and Optical Communications (RTUWO), Riga, Latvia, 5–6 November 2015; pp. 50–53.
24. Wang, Z.; Xu, Z.; Dong, B.; Xu, W.; Yang, J. Dandelion: An Online Testbed for LoRa Development. In Proceedings of the 2019 15th International Conference on Mobile Ad-Hoc and Sensor Networks (MSN), Shenzhen, China, 11–13 December 2019; pp. 439–444.
25. Portaluri, G.; Ojo, M.; Giordano, S.; Tamburello, M.; Caruso, G. Open CLORO: An open testbed for cloud robotics. In Proceedings of the 2019 IEEE 24th International Workshop on Computer Aided Modeling and Design of Communication Links and Networks (CAMAD), Limassol, Cyprus, 11–13 September 2019; pp. 1–5.
26. Guo, C.; Prasad, R.V.; He, J.J.; Jacobsson, M.; Niemegeers, I.G. Designing a flexible and low-cost testbed for Wireless Sensor Networks. Int. J. Ad Hoc Ubiquitous Comput. 2012, 9, 111–121.
27. Latre, S.; Leroux, P.; Coenen, T.; Braem, B.; Ballon, P.; Demeester, P. City of Things: An integrated and multi-technology testbed for IoT smart city experiments. In Proceedings of the 2016 IEEE International Smart Cities Conference (ISC2), Trento, Italy, 12–15 September 2016; pp. 1–8.
28. AbdelHafeez, M.; AbdelRaheem, M. AssIUT IoT: A remotely accessible testbed for Internet of Things. In Proceedings of the 2018 IEEE Global Conference on Internet of Things (GCIoT), Alexandria, Egypt, 5–7 December 2018; pp. 1–6.
29. AbdelHafeez, M.; Ahmed, A.H.; AbdelRaheem, M. Design and operation of a lightweight educational testbed for Internet-of-Things applications. IEEE Internet Things J. 2020, 7, 11446–11459.
30. Pötsch, A.; Berger, A.; Möstl, G.; Springer, A. TWECIS: A testbed for wireless energy constrained industrial sensor actuator networks. In Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA), Barcelona, Spain, 16–19 September 2014; pp. 1–4.
31. Pötsch, A. Ph.D. forum abstract: A scalable testbed infrastructure for embedded industrial wireless sensor and actuator networks. In Proceedings of the 2016 15th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), Vienna, Austria, 11–14 April 2016; pp. 1–2.
32. Brodard, Z.; Jiang, H.; Chang, T.; Watteyne, T.; Vilajosana, X.; Thubert, P.; Texier, G. Rover: Poor (but Elegant) Man's Testbed. In Proceedings of the 13th ACM Symposium on Performance Evaluation of Wireless Ad Hoc, Sensor, & Ubiquitous Networks, Valletta, Malta, 13–17 November 2016; pp. 61–65.
33. Peng, Z.; Wei, L.; Wang, Z.; Want, L.; Zuba, M.; Cui, J.H.; Zhou, S.; Shi, Z.; O'Donnell, J. Ocean-TUNE UCONN testbed: A technology incubator for underwater communication and networking. In Proceedings of the 2014 Underwater Communications and Networking (UComms), Sestri Levante, Italy, 3–5 September 2014; pp. 1–4.
34. Alsukayti, I.S. A multidimensional Internet of Things testbed system: Development and evaluation. Wirel. Commun. Mob. Comput. 2020, 2020, 1–17.
35. Brunisholz, P.; Dublé, E.; Rousseau, F.; Duda, A. WalT: A reproducible testbed for reproducible network experiments. In Proceedings of the 2016 IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), San Francisco, CA, USA, 10–14 April 2016; pp. 146–151.
36. Dludla, A.G.; Abu-Mahfouz, A.M.; Kruger, C.P.; Isaac, J.S. Wireless sensor networks testbed: ASNTbed. In Proceedings of the 2013 IST-Africa Conference & Exhibition, Nairobi, Kenya, 29–31 May 2013; pp. 1–10.
37. Zhao, Z.; Yang, G.H.; Liu, Q.; Li, V.O.; Cui, L. EasiTest: A multi-radio testbed for heterogeneous wireless sensor networks. In Proceedings of the IET International Conference on Wireless Sensor Network 2010 (IET-WSN 2010), Beijing, China, 15–17 November 2010; pp. 104–108.
38. Bekan, A.; Mohorcic, M.; Cinkelj, J.; Fortuna, C. An architecture for fully reconfigurable plug-and-play wireless sensor network testbed. In Proceedings of the 2015 IEEE Global Communications Conference (GLOBECOM), San Diego, CA, USA, 6–10 December 2015; pp. 1–7.
39. Wen, J.; Ansar, Z.; Dargie, W. MobiLab: A testbed for evaluating mobility management protocols in WSN. In Proceedings of the Testbeds and Research Infrastructures for the Development of Networks and Communities: 11th International Conference, TRIDENTCOM 2016, Hangzhou, China, 14–15 June 2016; Revised Selected Papers; Springer: Berlin/Heidelberg, Germany, 2017; pp. 49–58.
40. Förster, A.; Förster, A.; Garg, K.; Giordano, S.; Gambardella, L.M. MOTEL: Mobility Enabled Wireless Sensor Network Testbed. Ad Hoc Sens. Wirel. Netw. 2015, 24, 307–331.
41. Jiménez-González, A.; Martínez-de Dios, J.R.; Ollero, A. An integrated testbed for cooperative perception with heterogeneous mobile and static sensors. Sensors 2011, 11, 11516–11543.
42. Alvanou, A.G.; Zervopoulos, A.; Papamichail, A.; Bezas, K.; Vergis, S.; Stylidou, A.; Tsipis, A.; Komianos, V.; Tsoumanis, G.; Koufoudakis, G.; et al. CaBIUs: Description of the enhanced wireless campus testbed of the Ionian University. Electronics 2020, 9, 454.
43. Schaerer, J.; Zhao, Z.; Carrera, J.; Zumbrunn, S.; Braun, T. SDNWisebed: A software-defined WSN testbed. In Proceedings of the Ad-Hoc, Mobile, and Wireless Networks: 18th International Conference on Ad-Hoc Networks and Wireless, ADHOC-NOW 2019, Luxembourg, Luxembourg, 1–3 October 2019; Proceedings 18; Springer: Berlin/Heidelberg, Germany, 2019; pp. 317–329.
44. Olivares, T.; Royo, F.; Ortiz, A.M. An experimental testbed for smart cities applications. In Proceedings of the 11th ACM International Symposium on Mobility Management and Wireless Access, Barcelona, Spain, 3–8 November 2013; pp. 115–118.
45. Munoz, J.; Rincon, F.; Chang, T.; Vilajosana, X.; Vermeulen, B.; Walcarius, T.; Van de Meerssche, W.; Watteyne, T. OpenTestBed: Poor man's IoT testbed. In Proceedings of the IEEE INFOCOM 2019-IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), Paris, France, 29 April–2 May 2019; pp. 467–471.
46. Lim, R.; Ferrari, F.; Zimmerling, M.; Walser, C.; Sommer, P.; Beutel, J. FlockLab: A testbed for distributed, synchronized tracing and profiling of wireless embedded systems. In Proceedings of the 12th International Conference on Information Processing in Sensor Networks, Philadelphia, PA, USA, 8–11 April 2013; pp. 153–166.
47. Trüb, R.; Da Forno, R.; Gsell, T.; Beutel, J.; Thiele, L. Demo abstract: A testbed for long-range LoRa communication. In Proceedings of the 2019 18th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), Montreal, QC, Canada, 16–18 April 2019; pp. 342–343.
48. Salmins, A.; Ozols, K.; Ruskuls, R. Data management in TestBed for large scale wireless sensor networks. In Proceedings of the 2015 Advances in Wireless and Optical Communications (RTUWO), Riga, Latvia, 5–6 November 2015; pp. 54–57.
49. Judvaitis, J.; Nesenbergs, K.; Balass, R.; Greitans, M. Challenges of DevOps ready IoT Testbed. In Proceedings of the CEUR Workshop Proceedings, Lisbon, Portugal, 26 March 2019; Volume 2442, pp. 3–6.
50. Salmins, A.; Judvaitis, J.; Balass, R.; Nesenbergs, K. Mobile wireless sensor network TestBed. In Proceedings of the 2017 25th Telecommunication Forum (TELFOR), Belgrade, Serbia, 21–22 November 2017; pp. 1–4.
51. Judvaitis, J.; Salmins, A.; Nesenbergs, K. Network data traffic management inside a TestBed. In Proceedings of the 2016 Advances in Wireless and Optical Communications (RTUWO), Riga, Latvia, 3–4 November 2016; pp. 152–155.
52. Pradeep, P.; Divya, P.; Devi, R.A.; Rekha, P.; Sangeeth, K.; Ramesh, M.V. A remote triggered wireless sensor network testbed. In Proceedings of the 2015 Wireless Telecommunications Symposium (WTS), New York, NY, USA, 15–17 April 2015; pp. 1–7.
53. Doddavenkatappa, M.; Chan, M.C.; Ananda, A.L. Indriya: A low-cost, 3D wireless sensor network testbed. In Proceedings of the Testbeds and Research Infrastructure. Development of Networks and Communities: 7th International ICST Conference, TridentCom 2011, Shanghai, China, 17–19 April 2011; Revised Selected Papers 7; Springer: Berlin/Heidelberg, Germany, 2012; pp. 302–316.
54. Ju, X.; Zhang, H.; Sakamuri, D. NetEye: A user-centered wireless sensor network testbed for high-fidelity, robust experimentation. Int. J. Commun. Syst. 2012, 25, 1213–1229.
55. Mir, Z.H.; Park, H.; Moon, Y.B.; Kim, N.S.; Pyo, C.S. Design and deployment of testbed for experimental sensor network research. In Proceedings of the Network and Parallel Computing: 9th IFIP International Conference, NPC 2012, Gwangju, Korea, 6–8 September 2012; Proceedings 9; Springer: Berlin/Heidelberg, Germany, 2012; pp. 264–272.
56. Gao, Y.; Zhang, J.; Guan, G.; Dong, W. LinkLab: A scalable and heterogeneous testbed for remotely developing and experimenting IoT applications. In Proceedings of the 2020 IEEE/ACM Fifth International Conference on Internet-of-Things Design and Implementation (IoTDI), Sydney, Australia, 21–24 April 2020; pp. 176–188.
57. Nati, M.; Gluhak, A.; Abangar, H.; Headley, W. SmartCampus: A user-centric testbed for Internet of Things experimentation. In Proceedings of the 2013 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), Atlantic City, NJ, USA, 24–27 June 2013; pp. 1–6.
58. Nati, M.; Gluhak, A.; Domaszewicz, J.; Lalis, S.; Moessner, K. Lessons from SmartCampus: External experimenting with user-centric Internet-of-Things testbed. Wirel. Pers. Commun. 2017, 93, 709–723.
59. Venkatapathy, A.K.R.; Roidl, M.; Riesner, A.; Emmerich, J.; ten Hompel, M. PhyNetLab: Architecture design of ultra-low power Wireless Sensor Network testbed. In Proceedings of the 2015 IEEE 16th International Symposium on A World of Wireless, Mobile and Multimedia Networks (WoWMoM), Boston, MA, USA, 14–17 June 2015; pp. 1–6.
60. Falkenberg, R.; Masoudinejad, M.; Buschhoff, M.; Venkatapathy, A.K.R.; Friesel, D.; ten Hompel, M.; Spinczyk, O.; Wietfeld, C. PhyNetLab: An IoT-based warehouse testbed. In Proceedings of the 2017 Federated Conference on Computer Science and Information Systems (FedCSIS), Prague, Czech Republic, 3–6 September 2017; pp. 1051–1055.
61. Burin des Rosiers, C.; Chelius, G.; Fleury, E.; Fraboulet, A.; Gallais, A.; Mitton, N.; Noël, T. SensLAB: Very Large Scale Open Wireless Sensor Network Testbed. In Proceedings of the Testbeds and Research Infrastructure. Development of Networks and Communities: 7th International ICST Conference, TridentCom 2011, Shanghai, China, 17–19 April 2011; Revised Selected Papers 7; Springer: Berlin/Heidelberg, Germany, 2012; pp. 239–254.
62. des Roziers, C.B.; Chelius, G.; Ducrocq, T.; Fleury, E.; Fraboulet, A.; Gallais, A.; Mitton, N.; Noel, T.; Valentin, E.; Vandaële, J. Two demos using SensLAB: Very large scale open WSN testbed. In Proceedings of the 2011 International Conference on Distributed Computing in Sensor Systems and Workshops (DCOSS), Barcelona, Spain, 27–29 June 2011; pp. 1–2.
63. Adjih, C.; Baccelli, E.; Fleury, E.; Harter, G.; Mitton, N.; Noel, T.; Pissard-Gibollet, R.; Saint-Marcel, F.; Schreiner, G.; Vandaele, J.; et al. FIT IoT-LAB: A large scale open experimental IoT testbed. In Proceedings of the 2015 IEEE 2nd World Forum on Internet of Things (WF-IoT), Milan, Italy, 14–16 December 2015; pp. 459–464.
64. Harter, G.; Pissard-Gibollet, R.; Saint-Marcel, F.; Schreiner, G.; Vandaele, J. FIT IoT-LAB: A Large Scale Open Experimental IoT Testbed. In Proceedings of the 21st Annual International Conference on Mobile Computing and Networking, Paris, France, 7–11 September 2015; pp. 176–178.
65. Sanchez, L.; Muñoz, L.; Galache, J.A.; Sotres, P.; Santana, J.R.; Gutierrez, V.; Ramdhany, R.; Gluhak, A.; Krco, S.; Theodoridis, E.; et al. SmartSantander: IoT experimentation over a smart city testbed. Comput. Netw. 2014, 61, 217–238.
66. Lanza, J.; Sánchez, L.; Muñoz, L.; Galache, J.A.; Sotres, P.; Santana, J.R.; Gutiérrez, V. Large-scale mobile sensing enabled internet-of-things testbed for smart city services. Int. J. Distrib. Sens. Netw. 2015, 11, 785061.
67. Jara, A.J.; Genoud, D.; Bocchi, Y. Big data for smart cities with KNIME: A real experience in the SmartSantander testbed. Softw. Pract. Exp. 2015, 45, 1145–1160.
68. Judvaitis, J.; Abolins, V.; Elkenawy, A.; Ozols, K. Available Wireless Sensor Network and Internet of Things testbed facilities: Dataset. Zenodo 2022.
Figure 1. An overview of the systematic review methodology and the results for each phase.
Figure 2. Testbed facility main articles by year.
Figure 3. Map of all located testbed facilities. Blue markers denote the exact location, red markers the city, and orange markers the country of a facility. The left panel shows the whole world; the right panel shows Europe, where the testbeds are clustered much more densely.
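A map like Figure 3 is straightforward to reproduce. The sketch below is a minimal example using the folium Python library (assumed installed); the facility names and coordinates are illustrative placeholders, not values from the reviewed dataset, and only the caption's color coding (blue = exact location, red = city, orange = country) is taken from the figure.

```python
# Minimal sketch of the Figure 3 marker scheme; names and coordinates
# are illustrative assumptions, not the actual facility data.
import folium

# (name, latitude, longitude, location precision)
facilities = [
    ("Facility A", 56.95, 24.11, "exact"),   # exact coordinates known
    ("Facility B", 48.85, 2.35, "city"),     # only the city is known
    ("Facility C", 35.00, 105.00, "country"),  # only the country is known
]
# Color coding from the Figure 3 caption.
colors = {"exact": "blue", "city": "red", "country": "orange"}

world_map = folium.Map(location=[30, 10], zoom_start=2)
for name, lat, lon, precision in facilities:
    folium.Marker(
        location=[lat, lon],
        popup=name,
        icon=folium.Icon(color=colors[precision]),
    ).add_to(world_map)
world_map.save("testbed_map.html")  # open the HTML file in a browser
```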
Table 1. Testbed facilities.

| Name and Main Article | References | Year | DUTs | Active | Main Features |
|---|---|---|---|---|---|
| Dandelion [24] | — | 2019 | NA | Yes | Indoor and outdoor DUTs, sub-GHz LoRa DUTs |
| Open CLORO [25] | — | 2019 | NA | No | DUTs include a moving robot, Android app for robot movement control |
| FIST [26] | — | 2012 | NA | No | Wirelessly managed DUTs, the same radio channel for control and experimentation, tree topology |
| City of Things [27] | — | 2016 | NA | Yes | Urban deployment, integrated big data analysis, includes wearables, data visualization |
| AssIUT IoT [28] | Update [29] | 2018 | 4 | No | DUTs supporting LoRa, WiFi, ZigBee and cellular communication, web interface |
| TWECIS [30] | Abstract [31] | 2014 | 4 | No | GPIO event timestamping, support of gdb, network-wide power monitoring |
| Rover [32] | — | 2016 | 8 | No | Open source, cheap to implement |
| Ocean-TUNE UCONN testbed [33] | — | 2014 | 10 | No | Testbed facility for underwater WSN, multiple configurable acoustic modems, web interface |
| Multidimensional Internet of Things Testbed System [34] | — | 2020 | 12 | No | DUTs supporting LoRaWAN, 6LoWPAN, ZigBee and BLE communication, power monitoring |
| WalT [35] | — | 2016 | 12 | Yes | Reproducible platform, DUT OS image management, automated topology discovery, cheap, open-source |
| ASNTbed [36] | — | 2013 | 16 | No | Dedicated base station node |
| EasiTest [37] | — | 2010 | 20 | No | Multi-radio DUTs, web interface |
| Fully reconfigurable plug-and-play wireless sensor network testbed [38] | — | 2015 | 20 | Yes | Configurable wireless transceivers, CoAP handlers for interaction |
| MobiLab [39] | — | 2017 | 21 | No | Mobile DUTs, robot movement control, CLI |
| MOTEL [40] | — | 2015 | 23 | No | DUTs include two different moving robots and a camera, robot movement control |
| Integrated Testbed for Cooperative Perception [41] | — | 2011 | 27 | Yes | Mobile and static DUTs, a large variety of sensors, time synchronization, indoor positioning, robot movement control |
| CaBIUs [42] | — | 2020 | 30 | No | Custom-designed programming language, web interface |
| SDNWisebed [43] | — | 2019 | 40 | No | SDN networking support, traffic statistics for DUTs |
| I3ASensorBed [44] | — | 2013 | 46 | No | Wide range of sensors, including CO2, presence, smoke, etc. |
| OpenTestBed [45] | — | 2019 | 80 | Yes | Fully open-source |
| FlockLab [46] | Demo [47] | 2013 | 106 | Yes | GPIO tracing and actuation, power monitoring, time synchronization |
| EDI TestBed [23] | Updates [48,49,50,51] | 2015 | 110 | Yes | GPIO interaction, power monitoring, ADC and DAC interaction, versatile deployment options, CLI |
| RT Lab [52] | — | 2015 | 115 | Yes | Indoor and outdoor DUTs, online code editing, parameterized control, digital multimeter for power monitoring |
| Indriya [53] | — | 2012 | 127 | Yes | Small maintenance costs, distributed over three floors |
| NetEye [54] | — | 2012 | 130 | Yes | Topology control, health monitoring, policy-based scheduling |
| USN testbed [55] | — | 2020 | 142 | No | Indoor and outdoor DUTs, management GUI |
| LinkLab [56] | — | 2020 | 155 | Yes | Web interface, online compilation, self-inspection module |
| SmartCampus [57] | Demo [58] | 2013 | 240 | Yes | DUTs include smartphones, public display infrastructure for user interaction, topology explorer |
| PhyNetLab [59] | Update [60] | 2015 | 350 | Yes | Deployed in a materials handling facility, time synchronization, data visualization, OTA reprogramming, energy consumption accounting |
| SensLab [61] | Demo [62] | 2012 | 1024 | No | 4 interconnected locations, mobile DUTs, pre-made virtual machines for development |
| FIT IoT-LAB [63] | Demo [64] | 2015 | 2845 | Yes | Mobile robot nodes over six facilities, five different hardware platforms, open-source visualization and interaction tools |
| SmartSantander [65] | Update [66], Demo [67] | 2014 | 3530 | Yes | Urban deployment, multi-tier DUTs with different network technologies, mobile and static DUTs, end-user involvement |
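Because the facilities in Table 1 are also published as a machine-readable dataset [68], shortlisting candidates can be scripted. The sketch below assumes pandas and a hypothetical testbeds.csv export whose columns mirror Table 1 (Name, Year, DUTs, Active); it is not the actual schema of the Zenodo dataset.

```python
# Minimal sketch: shortlist testbed facilities from a hypothetical CSV
# export of Table 1 (columns: Name, Year, DUTs, Active, Features).
import pandas as pd

df = pd.read_csv("testbeds.csv")
# Facilities with an unknown DUT count are listed as "NA"; coerce to NaN.
df["DUTs"] = pd.to_numeric(df["DUTs"], errors="coerce")

# Keep facilities that are still active and offer at least 100 DUTs,
# e.g., FlockLab, EDI TestBed, FIT IoT-LAB, or SmartSantander in Table 1.
shortlist = df[(df["Active"] == "Yes") & (df["DUTs"] >= 100)]
print(shortlist.sort_values("DUTs", ascending=False)[["Name", "Year", "DUTs"]])
```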
Table 2. Provided DUTs, sensors, and actuators. [Table listing, for each of the 32 facilities, the number of provided DUTs by class (Custom, LPED, HPED, MD, SBC) and the number of sensors and actuators by category (IMU, acoustic, air quality, presence, location, environment, energy, actuators), together with column totals.]
Table 3. Testbed facilities with power monitoring capabilities.

| Testbed Facility | Resolution | Sampling Frequency | Range | Features |
|---|---|---|---|---|
| EDI TestBed | 100 μA | 100 kHz | 0.1 mA–100 mA | Monitoring, adjustable power supply |
| FIT IoT-LAB | NA | NA | NA | Monitoring |
| FlockLab | 10 nA | 56 kHz | NA | Monitoring, power profiling, adjustable power supply |
| RT Lab | NA | NA | NA | Monitoring |
| SensLab | 10 μA | 1 kHz | NA | Monitoring |
| TWECIS | NA | NA | 1 μA–100 mA | Monitoring |
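The resolution and sampling frequency reported in Table 3 determine how faithfully a sampled current trace can be turned into a charge or energy estimate. As a minimal sketch in plain Python, the following computes both from a trace; the 56 kHz rate mirrors FlockLab's entry, while the constant 3.0 V supply is an illustrative assumption rather than a value taken from any listed facility.

```python
# Minimal sketch: estimate charge and energy from a sampled current trace,
# as produced by the power monitoring facilities in Table 3.
# SAMPLE_RATE_HZ mirrors FlockLab's 56 kHz; the 3.0 V supply is an assumption.
SAMPLE_RATE_HZ = 56_000
SUPPLY_VOLTAGE_V = 3.0

def charge_and_energy(currents_a, sample_rate_hz=SAMPLE_RATE_HZ,
                      voltage_v=SUPPLY_VOLTAGE_V):
    """Return (charge in coulombs, energy in joules) for a current trace [A]."""
    dt = 1.0 / sample_rate_hz              # time per sample
    charge_c = sum(currents_a) * dt        # Q = sum(I_i) * dt
    return charge_c, charge_c * voltage_v  # E = Q * V for a constant supply

# One second of a DUT drawing a constant 10 mA: Q = 0.01 C, E = 0.03 J.
trace = [0.010] * SAMPLE_RATE_HZ
print(charge_and_energy(trace))
```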
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.