A touchscreen is a visual display that enables the user to control a device by touching the screen. The user can give input to, or control, the information processing system through simple or multi-touch gestures, using a stylus or one or more fingers.
Touchscreen technology has changed the way humans interact with computers: it allows the user to interact directly with what is displayed, rather than through a mouse, touchpad, or any other intermediate device (other than a stylus, which is optional for most modern touchscreens). Touchscreens can be found on a range of digital devices, such as tablet computers, and user interfaces designed for them are often seen as easier to use than older input technology.
Touchscreens are common in devices such as game consoles, personal computers, tablet computers, electronic voting machines, and smartphones. They can also be attached to computers or, as terminals, to networks. They also play a prominent role in the design of digital appliances such as personal digital assistants (PDAs) and e-book readers.
The popularity of smartphones, tablets, and many types of information appliances is driving the demand and acceptance of common touchscreens for portable and functional electronics. Touchscreens are found in the medical field and in heavy industry, as well as for automated teller machines (ATMs), and kiosks such as museum displays or room automation, where keyboard and mouse systems do not allow a suitably intuitive, rapid, or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying controller-based firmware have been made available by a wide array of after-market system integrators, and not by display, chip, or motherboard manufacturers. Display manufacturers and chip manufacturers worldwide have acknowledged the trend toward acceptance of touchscreens as a highly desirable user interface component and have begun to integrate touchscreens into the fundamental design of their products.
A touchscreen on a phone |
A touchscreen on a tablet computer |
History of Touchscreens
E.A. Johnson described his work on capacitive touchscreens in a short article published in 1965, and then more fully, with photographs and diagrams, in an article published in 1967. The applicability of touch technology to air traffic control was described in an article published in 1968. Bent Stumpe, an engineer at CERN, developed a transparent touchscreen in the early 1970s, based on his work at a television factory in the early 1960s. Manufactured by CERN, it was put to use in 1973. A resistive touchscreen was developed by American inventor George Samuel Hurst, who received US patent #3,911,215 on October 7, 1975. The first version was produced in 1982.
E.A. Johnson (on the right) |
In 1972, a group at the University of Illinois filed for a patent on an optical touchscreen that became a standard part of the Magnavox Plato IV Student Terminal. Thousands were built for the PLATO IV system. These touchscreens had a crossed array of 16 by 16 infrared position sensors, each composed of an LED on one edge of the screen and a matched phototransistor on the other edge, all mounted in front of a monochrome plasma display panel. This arrangement can sense any fingertip-sized opaque object in close proximity to the screen. A similar touchscreen was used on the HP-150 starting in 1983; this was one of the world's earliest commercial touchscreen computers. HP mounted their infrared transmitters and receivers around the bezel of a 9" Sony Cathode Ray Tube (CRT).
In 1985, Sega released the Terebi Oekaki, also known as the Sega Graphic Board, for the SG-1000 video game console and SC-3000 home computer. It consisted of a plastic pen and a plastic board with a transparent window where the pen presses are detected. It was used primarily for a drawing software application.
The Terebi Oekaki |
Touch-sensitive Control-Display Units (CDUs) were evaluated for commercial aircraft flight decks in the early 1980s. Initial research showed that a touch interface would reduce pilot workload, as the crew could then select waypoints, functions, and actions rather than be "head down" typing latitudes, longitudes, and waypoint codes on a keyboard. An effective integration of this technology was aimed at helping flight crews maintain a high level of situational awareness of all major aspects of vehicle operations, including the flight path, the functioning of various aircraft systems, and moment-to-moment human interactions.
In the early 1980s, General Motors tasked its Delco Electronics division with a project aimed at replacing an automobile's non-essential functions (i.e. other than throttle, transmission, braking, and steering) with solid-state alternatives wherever possible. The finished device was dubbed the ECC, for "Electronic Control Center": a digital computer and software control system hardwired to various peripheral sensors, servos, solenoids, an antenna, and a monochrome CRT touchscreen that functioned as both display and sole method of input. The ECC replaced the traditional mechanical stereo, fan, heater, and air-conditioner controls and displays, and could provide very detailed, specific information about the vehicle's cumulative and current operating status in real time. The ECC was standard equipment on the 1985–89 Buick Riviera and later the 1988–89 Buick Reatta, but it was unpopular with consumers, partly because of the technophobia of some traditional Buick customers, but mostly because of costly-to-repair technical problems with the ECC's touchscreen which, being the sole access method, would render climate-control or stereo operation impossible when it failed.
Multi-touch technology began in 1982, when the University of Toronto's Input Research Group developed the first human-input multi-touch system, using a frosted-glass panel with a camera placed behind the glass. In 1985, the University of Toronto group including Bill Buxton developed a multi-touch tablet that used capacitance rather than bulky camera-based optical sensing systems (see History of multi-touch).
In 1986, the first graphical point-of-sale (POS) software was demonstrated on the 16-bit Atari 520ST color computer. The ViewTouch POS software, first shown by its developer, Gene Mosher, at Fall Comdex 1986 in Las Vegas, Nevada, at the Atari Computer demonstration area, was the first commercially available POS system with a widget-driven color graphic touchscreen interface.
In 1987, Casio launched the Casio PB-1000 pocket computer with a touchscreen consisting of a 4x4 matrix, resulting in 16 touch areas in its small LCD graphic screen.
Casio's first touchscreen device, the PB-1000 |
Until 1988, touchscreens had a bad reputation for being imprecise. Most user-interface books stated that touchscreen selections were limited to targets larger than the average finger. At the time, a target was selected as soon as the finger came over it, and the corresponding action was performed immediately. Errors were common, due to parallax or calibration problems, leading to frustration. A new strategy called the "lift-off strategy" was introduced by researchers at the University of Maryland Human–Computer Interaction Lab and is still used today. As users touch the screen, feedback is provided as to what will be selected; users can adjust the position of the finger, and the action takes place only when the finger is lifted off the screen. This allowed the selection of small targets, down to a single pixel on a VGA screen (the best standard of the time).
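The lift-off strategy can be sketched in a few lines of event-handling pseudo-code (the event format and `hit_test` callback here are illustrative, not a real toolkit API): the target under the finger is only highlighted while the finger is down, and the action fires for whichever target the finger is over at lift-off.

```python
def lift_off_select(events, hit_test):
    """Sketch of the lift-off selection strategy.

    events: sequence of ("down" | "move" | "up", x, y) tuples.
    hit_test(x, y) -> name of the target under that point, or None.
    Returns the target selected when the finger lifts off, or None.
    """
    current = None
    for kind, x, y in events:
        if kind in ("down", "move"):
            # While touching, only *highlight* the target under the finger,
            # letting the user slide to correct for parallax or miscalibration.
            current = hit_test(x, y)
        elif kind == "up":
            # The action happens only now, on lift-off.
            return current
    return None
```

Because the selection is confirmed only on lift-off, the user can land on the wrong target and slide to the right one before committing, which is what made single-pixel targets usable.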
Sears et al. (1990) reviewed the academic research of the time on single- and multi-touch human–computer interaction, describing gestures such as rotating knobs, adjusting sliders, and swiping the screen to activate a switch (or a U-shaped gesture for a toggle switch). The University of Maryland Human–Computer Interaction Lab team developed and studied small touchscreen keyboards (including a study showing that users could type at 25 wpm on a touchscreen keyboard compared with 58 wpm on a standard keyboard), thereby paving the way for the touchscreen keyboards on mobile devices. They also designed and implemented multi-touch gestures such as selecting a range of a line, connecting objects, and a "tap-click" gesture to select while maintaining location with another finger.
In 1990, the University of Maryland Human–Computer Interaction Lab demonstrated a touchscreen slider, which was later cited as prior art in the lock-screen patent litigation between Apple and other touchscreen mobile-phone vendors (in relation to U.S. Patent 7,657,849).
Around 1991–92, the Sun Star7 prototype PDA implemented a touchscreen with inertial scrolling. In 1993, the IBM Simon, the first touchscreen phone, was released.
The IBM Simon, the first touchscreen phone to feature software applications |
An early attempt at a handheld game console with touchscreen controls was Sega's intended successor to the Game Gear, though the device was ultimately shelved and never released because of the high cost of touchscreen technology in the early 1990s. Touchscreens were not widely used for video games until the release of the Nintendo DS in 2004. Until recently, most consumer touchscreens could only sense one point of contact at a time, and few could sense how hard the user was pressing. This has changed with the commercialization of multi-touch technology.
The First Touchscreen Phone
Did you know? The first touchscreen phone that could run software applications was the IBM Simon, made in 1992 by IBM. A few competitors came out in the early '90s, but most mobile devices with touchscreens were more like PDAs.
A video about the "First touchscreen Phone"
How Touchscreens Work
Different kinds of touchscreen work in different ways. Some can sense only one finger at a time and get extremely confused if you try to press in two places at once; others can easily detect and distinguish more than one press at once. These are some of the main technologies:
A video of "How Touchscreen Works"
Resistive Touchscreen
A resistive touchscreen panel (currently the most popular kind) is coated with a thin, electrically conductive and resistive metallic layer. A touch changes the electrical current, which is registered as a touch event and sent to the controller for processing. Resistive panels are generally more affordable, but offer only about 75% optical clarity, and the layer can be damaged by sharp objects. They are not affected by outside elements such as dust or water. When you press on the screen, you force the polyester layer to touch the glass and complete a circuit, just like pressing a key on a keyboard. A chip inside the screen then works out the coordinates of the place you touched. When you press a resistive touchscreen, you push two conducting layers together so they make contact, a bit like an ordinary computer keyboard |
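As a rough sketch (the function and parameter names are illustrative, not a real driver API), a 4-wire resistive controller works out the coordinates from two voltage-divider readings, one per axis: the layers form a resistor, so the measured voltage is proportional to the distance of the touch along the driven axis.

```python
def resistive_position(v_x, v_y, v_ref, width, height):
    """Toy model of a 4-wire resistive touch readout.

    v_x, v_y: voltages measured while driving the X and Y layers in turn.
    v_ref:    drive voltage across each layer.
    Returns the touch point (x, y) in pixels, assuming a perfectly
    linear, calibrated panel.
    """
    # Each axis is a simple voltage divider: position is the measured
    # voltage as a fraction of the drive voltage, scaled to the screen.
    x = v_x / v_ref * width
    y = v_y / v_ref * height
    return x, y
```

Real controllers add calibration, debouncing, and pressure estimation on top of this idea, but the voltage-divider arithmetic is the core of it.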
Capacitive Touchscreen
A capacitive touchscreen panel is coated with a material that stores electrical charge. When the panel is touched, a small amount of charge is drawn to the point of contact. Circuits located at each corner of the panel measure the charge and send the information to the controller for processing. A capacitive panel must be touched with a bare finger (unlike resistive and surface-wave panels, which work with a finger or a stylus), but it can register touches at more than one place at once. In a capacitive touchscreen, the whole screen is like a capacitor: when you bring your finger up close, you affect the electric field between the inner and outer glass |
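A minimal sketch of how a surface-capacitance controller might turn the four corner measurements into a position (the names and linear weighting are illustrative assumptions, not a vendor algorithm): the larger a corner's share of the total current, the closer the finger is to that corner.

```python
def surface_cap_position(i_tl, i_tr, i_bl, i_br, width, height):
    """Estimate a touch point from the current drawn at the four corners
    (top-left, top-right, bottom-left, bottom-right) of the panel.

    Assumes, for simplicity, that each corner's share of the total
    current grows linearly as the finger approaches that corner.
    """
    total = i_tl + i_tr + i_bl + i_br
    # Right-hand corners pull more current when the touch is further right;
    # bottom corners pull more when the touch is further down.
    x = (i_tr + i_br) / total * width
    y = (i_bl + i_br) / total * height
    return x, y
```

With equal currents at all four corners, the estimate lands in the centre of the screen, as you would expect.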
Surface Acoustic Wave Touchscreen
This technology, surprisingly, uses sound waves: ultrasonic waves pass over the touchscreen panel, and when the panel is touched, a portion of the wave is absorbed. This change in the ultrasonic waves registers the position of the touch event, which is sent to the controller for processing. Surface-wave panels are the most advanced of these types, but they can be damaged by outside elements. A surface-acoustic wave screen is a bit like an infrared screen, but your finger interrupts high-frequency sound beams rippling over the surface instead of invisible light beams |
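The position along each axis can be recovered from *when* the received amplitude dips: the wave travels at a known speed, so the time of the absorption dip maps directly to a distance. A toy sketch, assuming idealized amplitude samples and a known wave speed (not a real controller algorithm):

```python
def saw_axis_position(amplitudes, sample_period, wave_speed):
    """Locate a touch along one axis of a surface-acoustic-wave panel.

    amplitudes:    received wave amplitude per sample; the touch shows
                   up as a dip (minimum) where energy was absorbed.
    sample_period: seconds between samples.
    wave_speed:    propagation speed of the surface wave (m/s).
    Returns the distance (metres) along the axis to the touch.
    """
    # The sample with the lowest amplitude marks when the absorbed part
    # of the wave train arrived; distance = speed x time.
    dip_index = min(range(len(amplitudes)), key=lambda i: amplitudes[i])
    return dip_index * sample_period * wave_speed
```

A second transducer pair at right angles gives the other axis the same way.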
Infrared Touchscreen
Infrared touchscreens are based on light-beam interruption technology. Instead of an overlay on the surface, a frame surrounds the display. The frame has light sources, or light-emitting diodes (LEDs), on one side and light detectors on the opposite side, creating an optical grid across the screen. When an object touches the screen, the invisible light beams are interrupted, causing a drop in the signal received by the photosensors. This type is costly, but has the advantage of being accurate. When your fingers move up close, they break invisible beams that pass over the surface of the screen between LEDs on one side and photocells on the other. |
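The grid logic is simple enough to sketch directly, assuming the controller merely reports which horizontal and which vertical beams are currently blocked (the representation below is an illustration, not a real device protocol):

```python
def ir_grid_position(row_blocked, col_blocked):
    """Locate a touch on an infrared beam grid.

    row_blocked: booleans, one per horizontal beam, True if interrupted.
    col_blocked: booleans, one per vertical beam, True if interrupted.
    Returns the centre of the blocked region as (col_index, row_index),
    or None if no beam is interrupted.
    """
    rows = [i for i, blocked in enumerate(row_blocked) if blocked]
    cols = [i for i, blocked in enumerate(col_blocked) if blocked]
    if not rows or not cols:
        return None
    # A fingertip usually breaks a couple of adjacent beams on each axis;
    # averaging their indices gives a sub-beam-resolution centre.
    return sum(cols) / len(cols), sum(rows) / len(rows)
```

This is also why the PLATO IV's 16x16 grid could sense "any fingertip-sized opaque object": anything that blocks at least one beam per axis registers.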
Near Field Imaging Touchscreen
Have you noticed how an old-style radio can buzz and whistle if you move your hand toward it? That's because your body affects the electromagnetic field that incoming radio waves create in and around the antenna. The closer you get, the more effect you have. Near field imaging (NFI) touchscreens work in a similar way. As you move your finger up close, you change the electric field on the glass screen, which instantly registers your touch. Much more robust than some of the other technologies, NFI screens are suitable for rough-and-tough environments (like military use). Unlike most of the other technologies, they can also detect touches from pens, styluses, or gloved hands. With a near-field imaging screen, small voltages are applied at the corners, producing an electric field on the surface. Your finger alters the field as it approaches. |
Optical Imaging Touchscreen
Optical touchscreens are a relatively modern development, in which two or more image sensors are placed around the edges (mostly the corners) of the screen. Infrared backlights are placed in the cameras' field of view on the other side of the screen. A touch shows up as a shadow, and each pair of cameras can then pinpoint the location of the touch, or even measure the size of the touching object (see visual hull). This technology is growing in popularity due to its scalability, versatility, and affordability, especially for bigger units. It detects touch by using infrared imaging sensors |
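The pinpointing step is plain triangulation. A toy two-camera setup (the geometry here is an assumption for illustration: cameras in the top-left and top-right corners, each reporting the angle at which it sees the shadow, measured downward from the top edge):

```python
import math

def optical_triangulate(angle_a, angle_b, width):
    """Intersect the two camera rays to find the touch point.

    angle_a: angle (radians, down from the top edge) seen by the
             camera in the top-left corner at (0, 0).
    angle_b: same, for the camera in the top-right corner at (width, 0).
    Returns (x, y) with x to the right and y downward.
    """
    ta, tb = math.tan(angle_a), math.tan(angle_b)
    # Ray from A: y = x * tan(a).  Ray from B: y = (width - x) * tan(b).
    # Setting them equal and solving for x:
    x = width * tb / (ta + tb)
    return x, x * ta
```

With two cameras there is one ray pair; extra cameras add redundant pairs, which is what lets larger units also estimate the size of the touching object.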
Dispersive Signal Technology
Introduced by 3M in 2002, this system uses sensors to detect the piezoelectricity generated in the glass by a touch. Complex algorithms then interpret this information to provide the actual location of the touch. The technology claims to be unaffected by dust and other outside elements, including scratches. Since it needs no additional elements on the screen, it also claims to provide excellent optical clarity. And since mechanical vibrations are used to detect a touch event, any object can be used to generate these events, including fingers and styluses. A downside is that, after the initial touch, the system cannot detect a motionless finger.
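In spirit, locating a tap from the vibrations it sends through the glass is a time-difference-of-arrival problem. A one-dimensional toy version (two sensors at the ends of an axis; the names and setup are illustrative, not 3M's actual algorithm, which works in 2-D with dispersive waves):

```python
def tdoa_position(dt, length, speed):
    """Locate a tap along a 1-D bar from wave arrival times.

    A tap at position x sends a bending wave to sensors at both ends:
    t_left = x / speed and t_right = (length - x) / speed, so the
    difference dt = t_left - t_right = (2x - length) / speed.
    Solving for x gives the tap position.
    """
    return (speed * dt + length) / 2
```

This also explains the listed downside: a motionless finger generates no new vibrations, so there is nothing left to time.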
This is how the dispersive signal touchscreen works
Light Pen Touchscreen
Light pens were an early form of touchscreen technology, but they worked in a completely different way from modern touchscreens. In old-style computer screens, the picture was drawn by an electron beam that scanned back and forth, just as in a cathode-ray-tube television. The pen contained a photoelectric cell that detected the electron beam as it passed by, sending a signal down a cable to the computer. Since the computer knew exactly where the electron beam was at any moment, it could figure out where the pen was pointing. Light pens could be used either to select menu items or text from the screen (similar to a mouse) or, as shown in the picture here, to draw computer graphics. Drawing on a screen with a light pen back in 1973; the light pen is actually connected to the computer by a long electric cable |
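Because the beam's position is a pure function of time within the frame, decoding the pen's position is just arithmetic. A simplified sketch (integer microsecond timings and a fixed line period are assumptions for clarity; real CRTs also have blanking intervals this ignores):

```python
def light_pen_position(t, line_time, width):
    """Convert a light-pen trigger time into a screen position.

    t:         time (microseconds) since the frame started, when the
               pen's photocell saw the passing electron beam.
    line_time: time (microseconds) the beam takes to scan one line.
    width:     horizontal resolution in pixels.
    Returns (x, scanline): how far along the line the beam was, and
    which line it was drawing.
    """
    scanline = int(t // line_time)          # whole lines completed so far
    x = (t % line_time) / line_time * width # fraction of the current line
    return x, scanline
```

So the "touch" sensing lived entirely in the timing electronics; the screen itself was an ordinary CRT.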
Construction of a Touchscreen
There are several principal ways to build a touchscreen. The key goals are to recognize one or more fingers touching a display, to interpret the command that this represents, and to communicate the command to the appropriate application. In the most popular techniques, the capacitive or resistive approach, there are typically four layers:
- Top polyester coated with a transparent metallic conductive coating on the bottom
- Adhesive spacer
- Glass layer coated with a transparent metallic conductive coating on the top
- Adhesive layer on the backside of the glass for mounting.
Dispersive-signal technology, which 3M created in 2002, measures the piezoelectric effect (the voltage generated when mechanical force is applied to a material) that occurs when a strengthened glass substrate is touched.
There are two infrared-based approaches. In one, an array of sensors detects a finger touching or almost touching the display, thereby interrupting light beams projected over the screen. In the other, bottom-mounted infrared cameras record screen touches.
In each case, the system determines the intended command based on the controls showing on the screen at the time and the location of the touch.
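That last step, mapping the touch location to whichever control is showing under it, is ordinary hit-testing. A minimal sketch, with hypothetical control names and rectangles:

```python
def dispatch_touch(x, y, controls):
    """Determine the intended command from a touch location.

    controls: list of (name, (left, top, right, bottom)) rectangles
              describing the controls currently on screen, front-most
              first. Returns the name of the control under the touch,
              or None if the touch missed everything.
    """
    for name, (left, top, right, bottom) in controls:
        if left <= x <= right and top <= y <= bottom:
            return name  # first (front-most) hit wins
    return None
```

Since the same screen coordinates mean different commands depending on what is displayed, the hit-test has to be rerun against the current control layout on every touch.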
Fingerprints
One of the most common problems with touchscreens is fingerprints on the display. This can be mitigated by using materials with optical coatings designed to reduce the visible effects of fingerprint oils, or oleophobic coatings (as on most modern smartphones), which lessen the actual amount of oil residue, or by installing a matte-finish anti-glare screen protector, which creates a slightly roughened surface that does not easily retain smudges. Fingerprints on a tablet |
A video of " How to clean a touch screen"
Advantages of Using Touchscreens
-Touchscreen devices usually have simpler user interfaces (e.g. Apple apps)
-Having fewer or no buttons means more room for a bigger screen
-For people worried about hygiene, most devices are easy to clean; some are even dirt-, dust- and grease-resistant
-For people new to or uncomfortable with conventional desktops, touchscreens are easy to use, helping more people get used to using computers
Touchscreen technology can help us do our daily work more efficiently.
Disadvantages of Using Touchscreens
-The screen has to be big enough to touch the buttons without missing
-A big, bright screen and the computing power needed to drive it mean a very short battery life
-In direct sunlight the screen is much less effective and often very difficult to read
-If a touchscreen device crashes, the whole screen becomes unresponsive, and the lack of physical buttons makes recovery very difficult
-The screen gets dirty very quickly
-Because touchscreen smartphones are so easy to use, they can encourage addictive use
-They usually cost more than ordinary devices
One drawback of touchscreens is that they are usually used in smartphones and can make people addicted, because they are portable and can be used with ease. |
Overall
The great thing about touchscreen technology is that it's incredibly easy for people to use. Touchscreens can display just as much information (and just as many touch buttons) as people need to complete a particular task and no more, leading people through quite a complex process in a very simple, systematic way. That's why touchscreen technology has proved perfect for public information kiosks, ticket machines at railroad stations, electronic voting machines, self-service grocery checkouts, military computers, and many similar applications where computers with screens and keyboards would be too troublesome to use.
Thanks For Reading!