Introduction to Physical Computing
=what is an interface?=
'''Skin-On Interfaces'''
https://www.youtube.com/watch?v=OuEhqHvE1qU&t=6s&ab_channel=MarcTeyssier
https://marcteyssier.com/projects/skin-on/
'''LINES - an Interactive Sound Art Exhibition'''
https://www.youtube.com/watch?v=hP36xoPXDnM&ab_channel=VoicesofU
'''Puzzle Facade'''
'''HäirIÖ: Human Hair as Interactive Material'''
https://www.youtube.com/watch?v=8JV2D7gJ5HI&ab_channel=HybridEcologies
'''FingerRing - simple way to play multichannel sound'''
https://www.youtube.com/watch?v=klD_suu74wc&t=2s&ab_channel=SoundArtistRU
=what is a physical interface?=
In this context, physical interfaces are often associated with physical computing.
Physical Computing is an approach to computer-human interaction design that starts by considering how humans express themselves physically.
In physical computing, we take the human body and its capabilities as the starting point and attempt to design interfaces, both software and hardware, that can sense and respond to what humans can physically do.
Starting with a person’s capabilities requires an understanding of how a computer can sense physical action. When we act, we cause changes in various forms of energy.
Speech generates the air pressure waves that are sound. Gestures change the flow of light and heat in a space. Electronic sensors can convert these energy changes into changing electronic signals that can be read and interpreted by computers. In physical computing, we learn how to connect sensors to the simplest of computers, called microcontrollers, in order to read these changes and interpret them as actions. Finally, we learn how microcontrollers communicate with other computers in order to connect physical action with multimedia displays.
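To make this concrete, here is a minimal sketch of that idea for an Arduino-style microcontroller: it reads a changing voltage from an analog sensor and passes the value on to a connected computer over the serial port. The pin number and serial speed are illustrative assumptions; any analog sensor wired to an analog input would work the same way.

<syntaxhighlight lang="arduino">
// Minimal sketch: read a changing voltage from an analog sensor and
// pass the value on to a connected computer over the serial port.
// The pin (A0) and the serial speed are illustrative assumptions.

const int SENSOR_PIN = A0;

void setup() {
  Serial.begin(9600);                    // open the serial link to the host computer
}

void loop() {
  int reading = analogRead(SENSOR_PIN);  // 0-1023, proportional to the sensed voltage
  Serial.println(reading);               // send the value so other software can respond to it
  delay(100);                            // roughly ten readings per second
}
</syntaxhighlight>

On the computer side, any program that can read the serial port (a Processing sketch, a Python script, and so on) can pick up these numbers and drive a multimedia display.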
Physical computing is very similar to our body.
It is all about communication: sensing signals -> computing and sending commands -> executing.
For example, we use our ears to listen, send the signals to our brain to decode them and compute a response,
and then send commands to our vocal cords to speak that response.
The difference is that inside the human body this communication happens through the nervous system,
while in a computer it happens through wires, circuits, and electrical signals.
And of course, the human body is far more complex and cannot be easily programmed.
If we reproduce what happened above in physical computing, it would look like this:
a microphone measures the overall sound level, and if it is too loud, a speaker starts beeping.
Let's break it down:
First, the mic converts air vibrations into electrical pulses --> these are sent to a processor --> which we program to send electrical pulses to the speaker whenever the signal from the mic exceeds a certain threshold --> the speaker converts those electrical pulses into a beep.
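As one possible sketch of that chain (not the only way to build it), assuming an analog microphone or sound-level module on pin A0 and a piezo speaker on pin 8 of an Arduino-style board; the threshold value is arbitrary and would need tuning for a real microphone:

<syntaxhighlight lang="arduino">
// Sketch of the mic -> processor -> speaker chain described above.
// Assumptions for illustration: an analog microphone / sound-level module
// on pin A0 and a piezo speaker on pin 8; the threshold is arbitrary.

const int MIC_PIN = A0;
const int SPEAKER_PIN = 8;
const int LOUDNESS_THRESHOLD = 600;       // out of 1023, picked by experiment

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  int level = analogRead(MIC_PIN);        // air vibration -> electrical signal -> number
  if (level > LOUDNESS_THRESHOLD) {
    tone(SPEAKER_PIN, 880, 200);          // too loud: beep at 880 Hz for 200 ms
  }
  delay(50);
}
</syntaxhighlight>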
Many sensors operate on the same principle.
For example, an LDR (light-dependent resistor) lowers its electrical resistance as more light reaches the component's sensitive surface.
And a button can close a circuit to let current flow through, or open the circuit to disconnect; the short sketch below reads both.
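This sketch is only an illustration and makes some wiring assumptions: the LDR sits in a voltage divider feeding analog pin A1 (with the LDR on the supply side, so more light means lower resistance and a higher reading), and the button connects digital pin 2 to ground, using the board's internal pull-up resistor.

<syntaxhighlight lang="arduino">
// Read the two example components: an LDR and a push button.
// Assumptions: LDR in a voltage divider on A1 (LDR on the supply side,
// so more light -> higher reading); button from pin 2 to ground,
// using the internal pull-up resistor.

const int LDR_PIN = A1;
const int BUTTON_PIN = 2;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);      // pin reads LOW when the button closes the circuit
  Serial.begin(9600);
}

void loop() {
  int light = analogRead(LDR_PIN);        // brightness as a number between 0 and 1023
  bool pressed = (digitalRead(BUTTON_PIN) == LOW);
  Serial.print("light: ");
  Serial.print(light);
  Serial.print("  button: ");
  Serial.println(pressed ? "closed" : "open");
  delay(200);
}
</syntaxhighlight>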