
1 Introduction

With the rapid development of the science and technology industry, the exchange of data between people, between people and technology, and between people and their environments no longer has boundaries. In 2014, the World Semiconductor Council identified the Internet of Things as the next big thing [1]. In recent years, the Internet of Things has been widely applied to smart living and has gradually entered artistic creation. Through the Internet of Things and sensors, data can play both a control role and a presentation role in interactive art. In 2018, we curated the “2018 Tsing Hua Effects – IoT Technology and Art Festival” [2], integrating the data of Tsing Hua University into artworks. We stored big data in the cloud and, through IoT sensing and access, integrated and controlled outdoor public artworks as an interactive interface. The data in an artwork could be appreciated, discussed, and accessed, and could even become part of the artwork itself. However, most interactive artworks are appreciated and participated in individually, and data is rarely passed between works. Therefore, we curated the “2019 Tsing Hua Effects – Cross-Media IoT Technology and Art Festival” [3] and developed the “Cross-Media IoT System”, which lets data cross between different media, such as mobile phones, computers, and social media, flow into artworks, and even cross from one artwork into another. When data moves across media, it is deconstructed and recombined, the role of the data (input/output) changes, and the data may change from text to images, from images to sounds, and even from digital to analog forms that control installation art. We call this phenomenon “Data Liquidity”: like water, data takes the shape of whatever container it flows into. In our research, data changes its format and its input/output role across different applications.
It is the advancement of digital technology that allows data to exist in this liquid state. We call this the “Data Paradigm Shift”.

2 Related Work

This chapter explains related work on the paradigm shift in the field of the humanities and arts. With the advent of the digital and technological age, a digital paradigm shift was proposed that influenced the creative methods of technology and art. We also discuss related creations that apply the IoT in technological artworks.

2.1 Paradigm Shift Related Studies in Humanities


Kuhn proposed the term Paradigm Shift in 1962 to describe changes in the basic concepts and experimental practices of a science, changes that also affect the development of scientific research [4]. Kolay described a digital paradigm shift in 2016, in which classic paintings in Indian culture were transformed into digital forms. The human characters in traditional Indian classic paintings are translated, through the conversion of digital media, into digital forms that the general public can more easily understand, which makes it easier to protect and disseminate the cultural heritage value of traditional Indian art [5]. Bowen and Giannini noted in 2018 that there is a paradigm shift in the Digital Ecosystem: through the Internet, people’s identities and spaces easily meet in virtual events, another case of digital paradigm shift [6]. The paradigm shifts mentioned above are mostly general applications of the concept. In our thesis, we emphasize that when data is transferred between installation artworks in an IoT context, the format and characteristics of the data change. We call this the “Data Paradigm Shift” and explain it further in later chapters.

2.2 Research on IoT Technology and Art Works

Wireless transmission technology has developed rapidly over the past few years. From the transmission methods of 802.11, Bluetooth, and 4G networks to today’s large-scale advances in LoRa, WiFi 6, and 5G networks, the idea of “interconnecting everything” has gradually come true. The performance of Internet technology and smartphones keeps growing, and people can appreciate artistic creation through different interactive interfaces. IoT technology, an extension of wireless sensor network technology, brings viewers even more ways to appreciate artistic creation.

Bill Fontana of the United Kingdom created the “Harmonic Bridge” in 2006 using wireless sensor network technology [7]. The work collects vibration and wind strength from sensors and processes them through wireless sensing technology, so that participants produce pleasant harmonic music as they walk across the bridge. At the first Taipei Digital Arts Festival and the ACM Multimedia workshop in 2006, Su-Chu Hsu, Ping-Yao Lin, Caven Chen, and others presented the “One Million Heartbeat” interactive artwork [8]. This work collects participants’ identity data, including gender and blood type, through a wireless sensor network. After the system has collected one million heartbeats, the participants’ identity data determines the gender and blood type of the twins in the work.

Aaron Koblin, Nik Hafermaas, and Dan Goods of the United States established the “eCLOUD” project at San Jose International Airport [9]. Since its establishment in 2007, it has used electrically switchable glass and an Internet of Things (IoT) system to collect climate data from around the world, creating a public sculpture with dynamic optoelectronic changes. Like clouds in the sky, the work lets people passing by see the weather changes in the cities they are traveling to.

The “ALAVs 2.0 (Autonomous Light Air Vessels)” project, created by Jed Berk in the United States in 2012 [10], combines small indoor airships with wireless sensor network technology. Participants “feed” the vessels with their likes or dislikes, giving the airships anthropomorphic forms and personalities.

“Mushroom Story” is an interactive artwork produced by Su-Chu Hsu, Shih-Ta Liu, and Po-Yao Wu in 2018 [11]. Participants scan a QR code with a mobile phone to open a WebAPP, enter any text, and upload it to a cloud server; the system performs text-to-speech and automatically downloads the result to the outdoor mushroom-shaped art installation. When participants approach the installation, the sensor detects them and automatically activates the voice data, playing the latest three synthesized utterances at random. The speech data thus becomes part of the interactive artwork.

3 Cross-Media IoT System

We designed the Cross-Media IoT System mainly to let data stream between different artworks; the format of the data can be transformed to suit the appearance of each artwork, so the data stays liquid. The Cross-Media Control System in Fig. 1 is one of the subsystems and the core part; it is embedded in every interactive installation. For different forms of artwork, we use an embedded sensor IC to detect interaction with the installation art. The system can also use the mobile phone’s G-Sensor for dynamic detection and decision making, then send the data to the server. After server-side processing, the data is returned to the mobile phone in time to show different interactive feedback.

On the mobile phone, we designed a related APP and WebAPP for each artwork, and both types can interact with the artwork. Users who download the native APP can, in general, connect directly via Bluetooth; WebAPP users, by contrast, can only connect through account binding. Through the APP/WebAPP, the phone’s built-in G-Sensor and wireless transmission let users carry the phone to participate in the different interactive installation arts in the space. With the phone’s APP/WebAPP as the medium, data is inputted into the artworks through the Cross-Media Control System, and the artworks are controlled interactively.

3.1 The Structure of Cross-Media IoT System

The Cross-Media IoT System includes the IoT Server and multiple interactive installation artworks, as shown in Fig. 1. Each interactive installation embeds a Cross-Media Control System, which includes the Cell Phone APP/WebAPP. The IoT Server receives the sensing signals of the Cross-Media Control System, and the mobile phone serves as the carrier for sending and receiving data in the system. The G-Sensor of the user’s phone performs “scoop” and “pour-in” motion detection to input data into an installation or output data from it. The format of the data is transformed for different artworks, and the input and output roles are exchanged.

The Cross-Media Control System is the core unit of the entire Cross-Media IoT System. It includes two parts: the Cell Phone APP/WebAPP and the Non-Contact Capacitive Data Control Interface. When the user interacts with the artwork through the Cell Phone APP/WebAPP, the input data is transmitted to the server through the interactive control interface; the server converts the information into data and returns it to the user’s phone for presentation. The G-Sensor of the phone then detects the flipping movement of the hand, the Non-Contact Capacitive Data Control Interface senses the phone, and the interactive device is triggered at the same time, so that the artwork produces different interactive results. The interaction generates data whose format may be converted among text, digital codes, audio, and video, and even between digital and analog. The converted data controls different installation arts to give different creative interactive presentations, and the data can even stream among different installations, affecting one another.

Fig. 1.

The structure of Cross-Media IoT System
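The round trip described above (phone input, server-side conversion, return to the phone, with the converted data held by the artwork) can be sketched in a few lines. All class and function names here are our hypothetical illustration; the paper does not publish its server implementation:

```python
# Minimal sketch of the Cross-Media Control System data flow
# (hypothetical names; the actual server code is not published).

def transform(data: str, target_format: str) -> str:
    """Convert data between the media formats used by different artworks."""
    if target_format == "morse":
        MORSE = {"S": "...", "O": "---"}  # abbreviated table, for illustration
        return " ".join(MORSE.get(ch, "?") for ch in data.upper())
    if target_format == "text":
        return data
    raise ValueError(f"unknown format: {target_format}")

class IoTServer:
    """Receives sensed input, converts it, and returns it to the phone."""
    def __init__(self):
        self.store = {}  # data currently held by each artwork

    def pour_in(self, artwork: str, data: str, fmt: str) -> str:
        converted = transform(data, fmt)
        self.store[artwork] = converted
        return converted  # echoed back to the APP/WebAPP for presentation

    def retrieve(self, artwork: str) -> str:
        return self.store.get(artwork, "")

server = IoTServer()
server.pour_in("Morse Code", "SOS", "morse")
print(server.retrieve("Morse Code"))  # → ... --- ...
```

The point of the sketch is that the server, not the artwork, owns the conversion step, which is what allows one installation's output to become another installation's input.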

3.2 Cross-Media Control System

The Cross-Media Control System is embedded in each interactive installation. Its purpose is to give users two interactive operations: “Data Retrieval” to input data and “Data Pouring In” to output data, as shown in Figs. 7 and 9. The main technologies are the “Non-Contact Capacitive Data Control Interface” and the “Tangible Interface in Mobile Phones”. The Non-Contact Capacitive Data Control Interface uses a capacitance sensor as the main interface to detect changes in the sensed capacitance. The output signal is sent to a microcontroller and then transmitted to the IoT Server through a wireless transmission module. After processing on the IoT Server, the data is transmitted to the cell phones with the Tangible Interface. These two control interfaces are described in detail below.

Non-Contact Capacitive Data Control Interface.

The Non-Contact Capacitive Data Control Interface mainly implements the two interactive operations “scoop” and “pour-in”. The interface contains a high-sensitivity capacitive sensing IC; we use Azoteq’s IQS128. The IQS128 is highly penetrating and can easily be embedded in wooden or acrylic exhibits: it can sense through about 6 mm of these materials or about 10 mm of glass, with a sensing range of about 15 cm. To process the changes in the capacitance signal, we use an Arduino and send a Beacon signal through a Bluetooth module. When a user holding a cell phone opens the APP, the phone scans for Bluetooth devices, finds the Bluetooth module name of the interface, and connects and binds to it. WebAPP users instead need to log in to an account to bind the phone; the user can choose either method to interact. After binding, the system starts to detect whether the phone performs a “scoop” or “pour-in” action. When the capacitive sensing module detects that the user is interacting with the phone and the proximity is less than 5 cm, the system is triggered to connect to the IoT Server for data transmission through the wireless transmission module and to change the interactive content on the APP/WebAPP.
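As a minimal sketch, the trigger condition of this interface can be expressed as below. The 5 cm trigger threshold and the roughly 15 cm sensing range come from the text; the function and variable names are our own assumptions:

```python
# Trigger-logic sketch for the Non-Contact Capacitive Data Control Interface.
# Thresholds follow the paper; all names are hypothetical.

TRIGGER_DISTANCE_CM = 5.0    # proximity needed to start data transmission
SENSING_RANGE_CM = 15.0      # approximate range of the IQS128 sensing IC

def should_trigger(distance_cm: float, phone_bound: bool) -> bool:
    """Start IoT data transfer only for a bound phone closer than 5 cm."""
    in_range = distance_cm <= SENSING_RANGE_CM
    return phone_bound and in_range and distance_cm < TRIGGER_DISTANCE_CM

# A phone hovering 3 cm above the exhibit surface, already bound via the APP:
print(should_trigger(3.0, phone_bound=True))   # → True
# An unbound phone, or one still 12 cm away, does not trigger:
print(should_trigger(12.0, phone_bound=True))  # → False
```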

Tangible Interface in Mobile Phones.

This chapter introduces the G-Sensor control technology that we developed in the past with smartphones as a tangible interface [12]. It uses the X-, Y-, and Z-axis data (Fig. 2, left) sensed by the acceleration sensor in the tangible interface to analyze the characteristics of three actions: scoop from bottom to top, scoop from left to right, and scoop from right to left (Fig. 2, right) [13].

Fig. 2.

Tangible interface with G-Sensor

Figures 3, 4 and 5 below show the results of our past experiments. Figure 3 is the analysis chart of the scoop from bottom to top. It can be seen from the figure that during the “scoop” action, the Y-axis data increases while the Z-axis data decreases.

Fig. 3.

G-Sensor with bottom to top

The scoop from left to right is shown in Fig. 4 below. In the blue frame, because the phone is moving to the right, the acceleration on the X axis is negative until it stops; due to gravity, the acceleration on the Z axis is positive and keeps increasing.

Fig. 4.

G-Sensor with left to right

Figure 5 below shows the data analysis of scooping from right to left. In the orange box, the phone is moving to the left, so the acceleration on the X axis is positive. Because the phone is horizontal after completing the scooping action, the X-axis data finally trends to nearly zero; in addition, the Z-axis data rises significantly because the phone is lifted up against gravity.

Fig. 5.

G-Sensor with right to left

Based on the above research on G-Sensor control technology, the Tangible Interface can detect the two gestures “scooping” and “pour-in” through the phone’s acceleration sensor and transmit the analysis results via wireless transmission to the Cross-Media Control System. If the coordinates of the phone match the position sensed by the Non-Contact Capacitive Data Control Interface, the work’s data can be transmitted while the phone is tilted for “scooping” or “pouring in”.

After analyzing these motion-detection data, we combined the two data interaction methods, “data retrieval” and “data pouring in”, to let users input and output data while interacting with the installation. When the user interacts through the phone’s APP/WebAPP interface, the phone and the Non-Contact Capacitive Data Control Interface perform the calculations, and the APP/WebAPP determines whether the current state is “data retrieval” or “data pouring in”. The interactive behavior of the installation then shows different content accordingly.
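The axis heuristics of Figs. 3, 4 and 5 can be sketched as a simple classifier. The axis-trend rules follow the text, but the windowing scheme, sample values, and names are illustrative assumptions, not the published detection code:

```python
# Sketch of the scoop-gesture heuristics from the G-Sensor experiments
# (axis-trend rules from the text; everything else is our assumption).

def trend(samples):
    """Net signed change of one accelerometer axis over a sampling window."""
    return samples[-1] - samples[0]

def classify_scoop(xs, ys, zs):
    dx, dy, dz = trend(xs), trend(ys), trend(zs)
    if dy > 0 and dz < 0:
        return "bottom-to-top"   # Fig. 3: Y rises while Z falls
    if dx < 0 and dz > 0:
        return "left-to-right"   # Fig. 4: X turns negative, Z keeps rising
    if dx >= 0 and dz > 0:
        return "right-to-left"   # Fig. 5: X positive then ~0, Z rises
    return "none"

print(classify_scoop([0, 0, 0], [0, 2, 4], [4, 2, 0]))    # → bottom-to-top
print(classify_scoop([0, -2, -4], [0, 0, 0], [0, 2, 4]))  # → left-to-right
```

In practice such a classifier would run over a short sliding window of readings, with thresholds tuned against the experimental charts rather than the zero crossings used here.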

4 Implementations

This section exemplifies and explains the implementation of the Cross-Media IoT System. All the works of the “Tsing Hua Effects 2019 – Cross-Media Technology and Art Festival” run under our Cross-Media IoT System and practice the concept of the Data Paradigm Shift. The development of this system was supported by a government subsidy, for which we especially thank the Ministry of Education, Republic of China (Taiwan) [14]. In the festival, about 10 interactive artworks were scattered across the campus. We use “Morse Code” and “Cross-Media Representor” to explain Data Liquidity, how data can be transferred between different works, and the concept of the Data Paradigm Shift.

4.1 The Steps of Cross-Media in Installation Art Implementation

“Morse Code”, by Ya-Lun Tao, is one of the works exhibited at the “Tsing Hua Effects 2019 – Cross-Media Technology and Art Festival” at National Tsing Hua University in Taiwan [15]. The creative concept of the work comes from the messages that different beings want to express: through this installation, the desire to speak to the God in the sky, to loved ones in the sky, to creatures in outer space, to the past, to the future, and to oneself is realized.

It mainly contains the following steps:

  • Step1: Data input from cell phone

People take their cell phones to “Morse Code”, scan the artwork’s QR code to obtain the corresponding APP or WebAPP and bind it, and enter a text of blessing words on the phone; the IoT Server then converts the text into Morse code, as shown in Fig. 6.

    Fig. 6.

    Scan QR-Code and enter text of blessing words

  • Step2: “Data Pouring In” installation art

People bring the phone to the front of the “Morse Code” installation, which has a Non-Contact Capacitive Data Control Interface embedded in it. Through the phone’s G-Sensor detection and the transition, the blessing text converted into Morse code can be “poured in” to the installation art, as shown in Fig. 7.

    Fig. 7.

    Using cell phone for “Data Pouring In” to installation art

  • Step3: Data cross into “Liquidity Transfer”

After “Morse Code” receives the Morse code data, the installation activates its shutter control so that the light beam carries the Morse code into the sky. At this point, the format of the data has changed from digital to the analog state of the motor-controlled shutter mechanism, as shown in Fig. 8.

    Fig. 8.

    Receiving the Morse code data and turning the shutter mechanism

  • Step4: “Data Retrieval” from installation art

Anyone can “retrieve” the words most recently converted into Morse code from the “Morse Code” installation and then walk to another installation, “Cross-Media Representor”, in Pigeon Plaza, as shown in Fig. 9. That is, the blessing text that was input to “Morse Code” is output to another work, “Cross-Media Representor”, by being retrieved.

    Fig. 9.

    Using cell phone for “Data Retrieval” from installation art

• Step5: “Data Pouring In” to another installation art

People walk to Pigeon Plaza and, similarly, through the phone’s G-Sensor detection and the Non-Contact Capacitive Data Control Interface, the retrieved data is poured into the “Cross-Media Representor” installation, as shown in Fig. 10.

    Fig. 10.

    Participants pouring in the retrieved data into the installation art

  • Step6: Data cross into installation art and data “Liquidity Retransferred”

After “Cross-Media Representor” receives the data, the device deconstructs the Morse code back into the original blessing words and sends them to a laser projector, which displays them as an animation projected on the wall of Pigeon Plaza, as shown in Fig. 11.

    Fig. 11.

    Morse code data is restored and transformed into an animated projection on the wall
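At its core, the journey through Steps 1-6 is an encode/transfer/decode round trip. The following is a minimal sketch using the standard international Morse table for letters; the installation's actual encoder is not published, so the function names and structure here are our illustration:

```python
# Round-trip sketch of Steps 1-6: blessing text → Morse code → restored text
# (standard Morse table for A-Z; implementation details are hypothetical).

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}
DECODE = {v: k for k, v in MORSE.items()}

def encode(text: str) -> str:
    """Steps 1-2: the blessing text is converted to Morse code on the server."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

def decode(code: str) -> str:
    """Step 6: the second artwork restores the original words."""
    return "".join(DECODE[s] for s in code.split())

msg = encode("HOPE")
print(msg)          # → .... --- .--. .
print(decode(msg))  # → HOPE
```

In Step 3 the encoded string additionally drives the shutter motor (digital to analog), and in Step 6 the decoded string drives the laser projection; both are downstream consumers of the same data.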

4.2 Data Paradigm Shift in Liquidity of Format

In the above example, through the “Cross-Media IoT System” we developed, the blessing text originally entered in “Morse Code” was transferred to Morse code, poured into the installation, and turned into an analog beam. The Morse code was then retrieved, poured into another work, restored to the original blessing text, and presented as an animation. The entire journey of the data, from text to Morse code to analog beam and back to text and animation, is what we call “Data Liquidity”. The format of the data changes during transmission, and the characteristics (input/output) of the data change as well. In the field of digital art, our Cross-Media IoT System breaks through the past restriction that the data of a digital artwork is presented only within the same work. Our system allows data to perform a “Data Paradigm Shift”.

In all the works of the “Tsing Hua Effects 2019 – Cross-Media Technology and Art Festival”, participants can interact by “fishing” and “pouring in” the data, which adds interest to the entire festival and creatively allows data to travel across different media, devices, and artworks.

5 Conclusions

The Cross-Media IoT System proposed here has been realized in 10 outdoor installation artworks in the “Tsing Hua Effects 2019 – Cross-Media Technology and Art Festival”. The user inputs and outputs data with a mobile phone, so that the data can be transferred to different installations for format conversion and role conversion. In the past, the interactive installations of many large outdoor art festivals were presented in ways preset by the creators themselves; relatively few works let the audience participate interactively, and where interaction existed it was usually limited to a single mode and lacked interconnections between different works. Our Cross-Media IoT System uses mobile phones to “scoop” (input) and “pour in” (output), which makes the interactive experience of the works much more fun. The participant-centered approach deepens the dialogue between participants and the work, and the liquidity of the data makes the interactive control of the works more diverse and interesting.

In the future, we expect the Cross-Media IoT System to be applied to more large-scale outdoor public art or technology art performances. The Cross-Media Control System can be embedded in works or on performers, allowing data to stream between different works and performers and to be appreciated in different media. Our system can also be applied to mobile learning in the field of education, allowing data to be learned across different media; the boundary between teachers and students can be crossed through our system, opening up various learning opportunities.

Our system changes the way users experience the humanities and arts. Advances in technology allow different data to span different devices, which becomes the key to breaking old norms and cognitions and gives new possibilities to interactive public art installations of science and art. When data is transformed in different media, deconstructed, and reorganized, the data format can change to control the artwork, and the role of the data (input/output) can stream between different works. In this thesis, we use the development and application of our system to interpret the concept of the Data Paradigm Shift.