Bernina Sewing Machines

Monday, 11 July 2016

iPad Repair Hawaii: The Concert Tablet Computer?

The elderly well-dressed crowd grows silent as the lights slowly dim. The conductor walks onto the stage in his white tie and tails to loud applause and salutes the orchestra, the concertmaster, and the audience. The moment has come. The internationally known soloist walks gracefully towards center stage – carrying an iPad?
Admittedly, this is not yet a common sight, but it is no longer taboo. Artists as well known as the violinist Joshua Bell, the Borromeo String Quartet, Axiom Brass, and the Minor 4th Trombone Quartet have all been known to play concerts replacing their sheet music with tablet computers.
There are some advantages, not the least of which is “turning” pages with a Bluetooth foot pedal or with the slight touch of a finger to the screen rather than darting the hand out between phrases to grasp and pull the dog-eared corner of the page (or relying on a page turner to get the timing right). The artist can also vary the music notation size on the screen to his or her taste. Some artists also believe that the smaller size of the tablet compared to a typical music stand allows more of the instrument’s sound to reach the audience.
Some disadvantages, however, are hard to ignore. Making notations in the margins with a pencil doesn’t work, though a number of sophisticated applications are now available, some of which allow use of a stylus in much the same way. The small screen (typically 9–10” diagonally) is seen as limiting by some, but the iPad Pro at 12.9” is closer to the size of standard sheet music. The Microsoft Surface Pro 5, slated for release in July of 2016, is expected to measure 13.3”. Some feel that different lighting conditions, varying from the darkness of the orchestra pit to bright sunlight, render the tablet computer hard to read, and certainly getting a “low battery” indicator or a Facebook notification during a concert is an unwelcome distraction. While these problems can be resolved or improved by adjusting the brightness, charging the iPad and turning off notifications prior to the concert, artists are as vulnerable as the rest of us to oversights, and so these issues will continue to be important.
In the world of classical music today there is a climate of urgency to appeal to a younger audience, and the Boston Symphony has begun to present “Casual Fridays,” where they provide 110 iPads to spectators to help turn the concerts into exciting multi-media presentations (if you are curious about some of the other ways we use our tablet PCs, see Some Interesting Facts about Our Tablet Computers, also from iPad Repair Hawaii).
World-renowned conductor Simon Rattle, who recently became the director of the London Symphony Orchestra, envisions a new ultra-modern music hall where young people will feel at home. Tablet computers can hold an almost unlimited repertoire of music with no more physical weight or volume than the tablet itself, and it is likely that more and more artists will use them as their primary reference during practice sessions. It may still be a while, however, before most will embrace them in live performances. There is one disadvantage that cannot be so easily overcome. While unexpected things can happen to sheet music, it cannot erase itself or suddenly turn black!
If your iPad or other tablet computer suddenly goes black, don’t panic (unless you are on stage performing in front of several thousand people!). Instead, contact MobileREMEDIES® Cellphone/Electronic Repair, with stores on Maui and Oahu. They can repair it quickly for less than replacement or insurance costs and give you a 1-year warranty on parts and labor! If they can’t fix your device, there is no charge for the attempt! Go to www.mobileremedies.com or call 1-800-867-5048 to speak to a professional.

Source: http://www.mobileremedies.com/info/ipad-repair-hawaii-the-concert-tablet-computer/

Friday, 6 May 2016

Laptop Repair Hawaii: The Brain-Computer Interface – Part 2

Observations from the Professionals at MobileREMEDIES®
Summary: What if we could communicate directly with our computers without using a physical gesture of any kind: control our computers with our thoughts? That’s just science fiction, right? Not so fast! The professionals at Laptop Repair Hawaii: MobileREMEDIES® present some interesting facts about a sub-group of the Human-Computer Interface (HCI) known as the Brain-Computer Interface or BCI that is doing just that! This article gives us a brief look at this fascinating field of study. In Part 1 we discussed some of the basic elements of any computer interface and looked at the historical development of BCI. Now, in Part 2, we will look at some of the feats already being achieved by BCI in the current state of the art and look ahead to some projections for the future.
BCI Today

BCI is still in its infancy and so its applications are still few in number. Presently, in early 2016, they can be divided into three main categories: Neuroprosthetics, Self-awareness and Gameplay. It is likely that even in the next few months, however, these comments will be outdated and the number and diversity of applications will grow rapidly. As we discussed in Part 1 of this article, each new computer interface allows us to use more complex input and output parameters that software developers can incorporate into progressively more sophisticated programs and devices.

The devices in all of these categories often use one or both of two ingenious ways of determining where on a visual field (i.e. a computer screen) a person’s attention is focused. This is analogous to moving the cursor with a mouse to a specific site on the screen and clicking it to indicate your choice, but it is accomplished using changes in the EEG instead!


One system uses “event-related potentials” (ERPs), which are peaks of activity in the brain waves that occur about 300 milliseconds (hence the name P300) after we recognize uniqueness or meaning in a particular stimulus. This is often called the “oddball paradigm” because the response is most pronounced when we see an unexpected or unusual stimulus among many expected or ordinary ones. Another system uses stimuli that “flicker” at slightly different frequencies. When our attention is concentrated on a particular stimulus, our brain waves also faintly “flicker” at the same rate, allowing the system to easily identify our choice.



These changes are called “steady-state visual evoked potentials” or SSVEPs. Other systems are being developed which identify patterns of brainwaves that are typically associated with particular states of mind such as concentration, relaxation and emotional involvement, and these will continue to grow in sophistication and specificity.
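To make the SSVEP idea concrete, here is a small illustrative sketch (in Python, using a synthetic signal and made-up numbers, not any actual headset’s SDK): the system simply asks which candidate flicker frequency carries the most power in the EEG spectrum.

```python
import numpy as np

def detect_ssvep(eeg, fs, candidate_freqs):
    """Pick the candidate flicker frequency with the most power in the EEG spectrum."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the spectral bin nearest each candidate flicker rate
    bin_power = [power[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(bin_power))]

# Synthetic "EEG": a faint 12 Hz oscillation (the attended flicker) buried in noise
fs = 250                                  # samples per second
t = np.arange(0, 4, 1.0 / fs)             # 4 seconds of data
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 12 * t) + rng.normal(0, 0.5, t.size)

choice = detect_ssvep(eeg, fs, [8, 10, 12, 15])   # → 12
```

Real systems must cope with far worse signal-to-noise ratios and much shorter time windows, but the principle is the same.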

Neuroprosthetics: In this article we will concentrate more on the second and third categories since they will soon impact all of our lives, but any discussion of BCI without at least a brief treatment of this topic would be deficient. We owe much of the progress in BCI research to the desire to improve quality of life in people with disabilities. So it is fitting that our most extensive research and our most spectacular successes have been and continue to be in this field.



To those who are unable to speak and have no functional use of their extremities, communicating their needs and desires is extremely difficult. Even if some code (such as eye blinks or movements) can be established, communication is very slow and rudimentary, offering no functional independence and no opportunity to express personal thoughts and emotions. Empathy for these individuals puts things into perspective for us all and is a worthwhile experience (see Part 1 of this article or visit: Thought control of robotic arms using the BrainGate system).

Neuroprosthetics (neuro = pertaining to the nervous system, prosthesis = artificial limb or body part) includes any technology that improves function by interfacing a mechanical or electronic device directly with the nervous system, not only with the brain but also with the motor and sensory nerves anywhere in the body. 


Thus, BCI represents only a sub-category of neuroprosthetics. Most applications to date have concentrated on either spelling/word generation or control of a prosthetic limb. Classically, techniques have been invasive, with electrodes implanted on or within the cortex of the brain, since these signals are of greater amplitude and more reliable.


Within the last few months however, some of the new commercial headsets using dry or saline electrodes have been used for spelling/word generation and speeds as high as 60 characters or 12 words per minute have been achieved. While this is not yet close to our normal communication rates it is a significant improvement over past performance and marks the beginning of an era of non-invasive, affordable alternatives for people with disabilities (To see high speed spelling in real time, follow this video link: PNAS: High-speed spelling with a noninvasive brain-computer interface). Some neuroscience centers are working on robotic devices to help quadriplegic people with their activities of daily living and even strap-in exoskeletons that have internal balance and locomotion systems that allow them to walk over short distances. For the time being these are still too bulky, costly and cumbersome to be of practical use, but the technology is moving forward very quickly and we will speak of it again briefly in the next section.

Self-Awareness: One of the most basic elements of BCI is biofeedback – this means translating some aspect of our own physiology that is typically unconscious or automatic into a measurable signal that in turn permits us (by experimentation) to consciously modify it. For example, it has long been known that awareness of one’s own heart rate can be used as a meditation technique. Some EEG waveforms can be correlated with states of relaxation, attention, awareness, emotional stress etc. Even though some critics of these techniques suggest that motions of the scalp, face and eyes exert such massive influence over these small electrical potentials that they overshadow the brainwaves and are the principal sources of these measurements, it doesn’t actually matter. The concept of biofeedback remains valid as long as the parameters measured are reproducible and reflect true changes in our thoughts. Thus the reasoning used here is that if we can measure our thoughts by any means and become better aware of our mental state, then we can also exert more conscious control over it.
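As a hypothetical illustration of such a biofeedback signal (the function, the “alpha band” cutoffs and the test numbers are our own simplifications, not taken from any commercial product), a “relaxation index” can be as simple as the fraction of EEG power falling in the 8–12 Hz band associated with relaxed wakefulness:

```python
import numpy as np

def relaxation_index(eeg, fs, band=(8.0, 12.0)):
    """Fraction of total EEG power in the alpha band -- a crude biofeedback signal."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = power[freqs > 0].sum()          # ignore the DC offset
    return power[in_band].sum() / total

# A dominant 10 Hz rhythm scores high; a fast 25 Hz rhythm scores low
fs = 128
t = np.arange(0, 2, 1.0 / fs)
calm = np.sin(2 * np.pi * 10 * t)
busy = np.sin(2 * np.pi * 25 * t)
```

Displayed back to the user in real time as a bar or a tone, a number like this closes the biofeedback loop: we experiment until we learn what makes it move.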
While all of the commercially available BCI systems have meditation and self-awareness applications, the Muse “Brain Sensing Headband” by InteraXon, Inc. (a Canadian company) touts this as its principal function and offers 4 active EEG electrodes and 3 reference sensors in a stylish white or black headband that interfaces wirelessly with your cell phone or tablet PC. The sleek headset sells for $249 and with its mobile app claims to “motivate you to change your brain” and “improves how you respond to stress” by providing you with the information you need to manipulate your brain waves.


The Aurora Dream band from Iwinks claims to allow you to reclaim your sleep for $299. It has only 1 EEG channel, however, so BCI represents only a small part of its operation. It also monitors heart rhythm, muscle tension, eye movements, body motion, acceleration and orientation, as well as providing colored LED lights that can be programmed to shine through your eyelids!


MindRDR is a Google Glass (newest version $1649) app that uses a NeuroSky MindWave Mobile Headset ($99) to create an image overlay with a vertical bar that moves to the top of the visual field when concentration is high and to the bottom of the field when it is low and relaxation is high, according to changes recorded by the single EEG sensor.  A picture is taken when concentration is highest and the user can either keep and send it or reject it, again by either maintaining or reducing concentration without using any gesture or verbalization.


The EPOC ($399) and EPOC+ ($499) Neuroheadsets and, more recently, the Insight Brainware® ($299) are somewhat more sophisticated headsets from Emotiv, an Australian company. The EPOC devices have 14 EEG sensors (the Insight has 5), with all headsets having 2 reference electrodes, creating the potential for greater spatial resolution. The EPOCs use saline-soaked pads and the Insight a semi-dry polymer at the points of contact with the scalp. The EPOC+ and the Insight have 9-axis motion, position and acceleration sensors (the EPOC has 2-axis sensors) and are equipped to identify up to 8 “emotional states,” including:

Instantaneous excitement, Long-term excitement, Stress, Engagement, Relaxation, Interest and Focus. The EPOCs also measure up to 12 facial expressions including: Blink, Left wink, Right wink, Furrow (frown), Raise brow (surprise), Smile, Clench teeth (grimace), Look left, Look right, Laugh, Smirk (left side) and Smirk (right side). The Insight measures only 7 expressions since it has only 5 sensors. They can all “memorize” up to 4 pre-trained mental commands from a list of 13 labels including push, pull, lift etc. and even “disappear”. The Insight also allows for user-definable commands. (To see an interesting demonstration follow this video link: Tan Le: A headset that reads your brainwaves)


So, shouldn’t each of us own at least one of these devices that we can use to “train” our brains, increase our concentration, improve our self-awareness and sleep better? The answer is “No”, or at least “Not yet”.  Because we feel intuitively that awareness of our mental state will allow us to better control our lives, we are vulnerable to the barrage of “pseudoscientific” marketing claims that a new application or device will “revolutionize” the way we think or learn. Someday we may have the data to back up these claims but presently there is nothing to suggest that learning to manipulate our own brain waves through biofeedback will necessarily make us healthier or better capable of coping with problems in our daily lives. The most significant contribution of this new technology is in making the measurement of the EEG and facial expressions practical and affordable and in offering Software Development Kits that will allow researchers to test hypotheses and develop more sophisticated input and output algorithms.

Gameplay: This is probably where the most widespread development will occur in the near future because the gaming market is already well established and includes by its nature many individuals who are willing to invest in an exciting new experience. No claim of improving one’s health or productivity is necessary even though it is often made!


BCI games are still relatively primitive simply because the technique is new and the degree of on-screen control is not only limited in scope but also in precision.  Some use a combination of hand controls for movement through the virtual world with a mentally performed task required at specific waypoints, usually some form of telekinesis.  For now the typical format is for smartphones, tablets and laptops and the genre is not quite ready for game consoles and hard-core gamers. This will have to wait until the big budget game studios decide to invest in this new technique.


One of the earliest games to be marketed was Mindflex from Mattel ($79), which came out for Christmas in 2009. It was a stand-alone unit that did not require a separate computer. It used a custom NeuroSky headset to interface with a small fan under a plastic obstacle course that would “levitate” a foam ball to varying heights depending on the fan’s speed, which increased with successful concentration. The Mindflex Duel ($145) in 2011 pitted 2 players against each other in “pushing” the ball to one side or the other depending on each player’s level of concentration. In both games, the obstacle course could be customized.


A similar stand-alone game, also “levitating” a ball with a fan, came out in 2009 from Uncle Milton Industries, called the Star Wars Force Trainer ($79), also using NeuroSky technology. It generated a second version in 2015 called Star Wars Force Trainer II – The Hologram Experience ($120). The new version requires the user to have a full-size tablet PC (iOS or Android); only the app, a Bluetooth headset and a projection box are provided. Players are coached by Yoda to perform various tasks, manipulating objects and “pushing” away foes, acquiring up to 10 levels of “Jedi” concentration!
All of the major headset manufacturers have games available for their devices. NeuroSky has almost 150 apps available on its website, with about one third of them classified as games. Most games are free or sell for less than $10, though one of the most popular, “Throw Trucks With Your Mind”, costs $25. Emotiv has 34 apps available on its website and about half of those are games. One is called Arena ($15) and allows a player to create and shoot fireballs at an enemy, or 2 players to shoot them at each other, using the “push” command. Another called MindDrone ($15) allows you to fly an AR.Drone with your brain waves and facial expressions. An interesting app from Emotiv that is not actually a game but certainly sounds like fun is called EmoLens ($40). This app automatically indexes Flickr photos according to your emotional response and facial expressions and allows you to create slide shows to fit your mood!

In short, though most of the existing apps are limited in scope and perhaps too simplistic to hold our attention over prolonged periods, the technology is poised to open a whole new era in the human computer interface with no end in sight.


The Future of BCI

As we noted in Part 1, the amplitude of the EEG signal is so small when measured at the scalp that it will always be subject to artifacts including micro-motion of the electrodes and stray electrical signals from the underlying muscles as well as those from the many electromagnetic devices that surround us in our daily lives (i.e. cell phones, computers, televisions, electrical transformers etc.). Any of these headsets left sitting on a desk will transmit signals similar in amplitude to the EEG without being in contact with anything but the surrounding air! With a signal-to-noise ratio that poor, much of our success in isolating and interpreting these waves in the future will come from our ability to instantaneously measure and subtract “noise” from the immediate environment. This is similar to the concept used in noise-cancelling headphones but is much more complex since it involves multiple physical and electrical sources. Until we can isolate and quantify other parameters that come directly from brain function, noise reduction will remain a central theme in the future of BCI.
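A toy sketch of the noise-subtraction idea (entirely synthetic signals and numbers, far simpler than any real system): if a reference channel records only the interference, the portion of the scalp signal that the reference explains can be fitted and subtracted, leaving the wanted signal behind.

```python
import numpy as np

def subtract_reference_noise(scalp, reference):
    """Remove the part of `scalp` explained by a noise-only reference channel.

    Fits scalp ~ w * reference by least squares and returns the residual --
    the simplest form of regression-based artifact removal.
    """
    w = np.dot(reference, scalp) / np.dot(reference, reference)
    return scalp - w * reference

# Synthetic example: a 6 Hz "brain" rhythm contaminated by 50 Hz mains hum
fs = 500
t = np.arange(0, 1, 1.0 / fs)
brain = np.sin(2 * np.pi * 6 * t)
hum = np.sin(2 * np.pi * 50 * t)
cleaned = subtract_reference_noise(brain + 3 * hum, hum)   # ≈ brain again
```

Real environments involve many simultaneous, shifting noise sources, which is what makes the problem so much harder than this single-channel sketch suggests.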


Of great importance, however, is that some of the “noise” that interferes with measurement of the EEG signal on the scalp, especially that generated by our facial expressions, comes from specific patterns of muscle stimulation that are subtle but reliable indicators of our thought processes and as such are measurable parameters that can be used by our computers to interpret our intentions. Even when we try to maintain a “poker face” by inhibiting overt facial expressions, signals to the appropriate muscles and microscopic contractions are still present and can be measured (except in very rare neurological conditions). Though they are not part of the direct brain-computer interface, these signals along with other biometric parameters (i.e. skin conductance, heart rate, blood pressure etc.) are easier to measure and may ultimately be found to be more reliable indicators of our thoughts than the EEG itself.

Thus, we can predict that for the foreseeable future, BCI will likely remain only one element in many “hybrid” systems that will make up the human computer interface. It will appear to us and to an observer that we are controlling our computers only with our minds when in fact complex algorithms using data from multiple biometric sources will come into play. Until recently, the massive amount of real-time data generated and the sophisticated processing required to use such a system was a formidable limiting factor. The advent of widespread high-bandwidth wireless technology and the exponential increase in the speed, power and storage capacities of our computing systems have now made this realistic. (If you would like to read more about this progression see: “The Changing Roles of our Desktops, Laptops, Tablets and Smartphones–Part 3,” also from Laptop Repair Hawaii: MobileREMEDIES®)


Communication remains at the center of our progress and historically we have relied on spoken and written language. It is unlikely that we will be able to bypass language in the near future so converting our thoughts into text will continue to be a high priority. New intuitive input algorithms will not only help disabled people communicate effectively but will eventually replace our keyboards. The increasing availability of non-invasive and affordable sensors is now bringing research into the mainstream and into the hands of many creative software developers. This will open up many new horizons.


High word generation speeds will likely come from apps that anticipate the next word from the existing text (similar to SwiftKey and QuickType) but are also based on an individual’s emotional status, facial expressions and other biometric parameters. Since people differ greatly from each other, the most accurate and rapid algorithms will depend on the application “learning” and accumulating phrases and their associated contexts for each individual. Progress will be very rapid in aiding the disabled since even small improvements in word generation speeds will be highly appreciated. Replacing our keyboards, however, is a different matter and represents a much more formidable task. Word generation speed and accuracy in BCI and/or hybrid systems will have to undergo considerable development and improvement before they can compete with well-established and highly efficient keyboards.
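As a hypothetical sketch of such per-user “learning” (a deliberately crude bigram model of our own invention, nothing like the statistical machinery in a real keyboard app), a predictor might simply count which word this user most often types after each word:

```python
from collections import Counter, defaultdict

class UserWordPredictor:
    """Counts which word a particular user most often types after each word."""

    def __init__(self):
        self.bigrams = defaultdict(Counter)

    def learn(self, text):
        """Accumulate word-pair counts from one utterance of the user's text."""
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.bigrams[prev][nxt] += 1

    def predict(self, word):
        """Suggest this user's most frequent follower of `word`, if any."""
        counts = self.bigrams.get(word.lower())
        if not counts:
            return None                      # nothing learned for this word yet
        return counts.most_common(1)[0][0]

p = UserWordPredictor()
p.learn("i would like some water")
p.learn("i would like some water")
p.learn("i would rather rest")
p.predict("would")    # → 'like'
```

The more the user communicates, the better the counts reflect that individual’s own phrasing, which is exactly the personalization argument made above.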


Other aspects of neuroprosthetics are currently flourishing and are poised to benefit tremendously from the addition of BCI and hybrid control systems, potentially improving quality-of-life and giving greater independence to people with disabilities (If you wish to understand better, follow this video link: Amanda Boxtel: Humanizing Machines with Functionality, Design, & Beauty). The most spectacular of these are the exoskeletons mentioned above that allow patients with paralysis to walk by incorporating sophisticated balance and control systems into the devices. Stronger and lighter structural materials combined with smaller and more powerful motors will continue to open new horizons and give new hope to people with spinal cord injuries and neuromuscular disease. Bionic replacement or supplementation of damaged body parts will be routine and highly effective in the next few decades partly due to BCI technology.


It is worthwhile making some observations about the scope of BCI since some people express their concerns that in the future it could be used inappropriately to invade our privacy. As was explained in Part 1, the EEG at its best is an average signal from several thousands or millions of neurons modified by the various tissues surrounding them. It can never be specific enough to reveal individual thoughts or ideas and can only give information about general trends such as mood, level of attention, relaxation etc. Even these vary widely enough between individuals that without baseline observations they are often unreliable. Thus the type of information available from the EEG alone will always be somewhat limited and relatively non-specific. So it is unlikely that our personal information, private thoughts or business secrets will be at risk.


This said, the technology is relatively good at determining classes of emotional responses to particular stimuli. This is what allows BCI (in conjunction with facial expressions) to classify images into a slide show that will fit our mood (see Games above) or arrange our iTunes files into “our best” playlists. This is where the greatest risk of loss of privacy will occur.  Just as Internet shopping sites tabulate our preferences in order to individualize their advertising, others could use this information about our emotional responses. The technique does not work backwards (influencing emotions or choices by applying electrical signals to the scalp – though it does if electrodes are placed directly on the cortex of the brain!) so there is little chance of “mind control” through our headsets.

We have looked at some of the basic elements of the brain-computer interface in the past, present and future. It is a science that is still in its infancy but recent developments in making EEG measurement practical and affordable have brought it into the forefront and we will certainly hear much more about it in the months and years to come.

At MobileREMEDIES®, with locations on Maui and Oahu and mail-in service from anywhere in the world, you get a free diagnostic evaluation and an estimate of the repair costs and time required. You also always get a 1-year warranty on parts and service. If they can’t fix your device, you pay nothing for the attempt! In addition to laptops and desktop computers, they also repair cell phones, iPads and all other tablet PCs as well as iPods and game systems (Xbox, PlayStations, Wii, etc.). They build custom computers for gamers and other high-demand users, recover lost data, provide web services for individuals and small businesses, buy broken devices for cash or in-store credit and sell refurbished devices with a 1-year warranty, similar to a manufacturer’s warranty on a new device. You can find them at Laptop Repair Hawaii, iPhone Repair Hawaii, iPad Repair Hawaii, iPod Repair Hawaii, Data Recovery Hawaii, Custom Computers Hawaii and Xbox Repair Hawaii. You may also go to www.mobileremedies.com or call 1-800-867-5048.
The End

Source: http://www.mobileremedies.com/info/laptop-repair-hawaii-the-brain-computer-interface-part-2/

Monday, 28 March 2016

Laptop Repair Hawaii: The Brain-Computer Interface – Part 1

Observations from the Professionals at MobileREMEDIES®

Introduction

Over the years we have learned to interact with our computers by manipulating various devices, including the keyboard, the mouse and trackball, the stylus and graphics tablet and, more recently, the touchscreen, using only the tip of a finger. We now have our computers identify us by our fingerprints, detect our heart rates and even sense how far we have walked or how many steps we have climbed. We call this the Human-Computer Interface or HCI. We continue to be more and more creative and sophisticated in how we interact with our computers, and there appears to be no end in sight as we begin using signals from our own bodies as a direct link.

What if we could communicate directly with our computers without using a physical gesture of any kind: control our computers with our thoughts? That’s just science fiction, right? Not so fast! The professionals at Laptop Repair Hawaii: MobileREMEDIES® present some interesting facts about a sub-group of HCI known as the Brain-Computer Interface or BCI that is doing just that! This article gives us a brief look at this fascinating field of study. In Part 1 we will discuss some of the basic elements of any computer interface and look at the historical development of BCI. In Part 2 we will look at some of the feats already being achieved by BCI in the current state of the art and look ahead to some projections for the future.


Human-Computer Interface Basics

In order to understand BCI we need to review some basic principles. To communicate with our computers we use some aspect of our behavior to modify a measurable “variable” that the computer can then use as an indication of our intention in a well-defined context.  Pressing a particular key or set of keys on the keyboard is a direct and simple example. Since the computer has the potential to analyze and process its own input signals however, we can also use much more sophisticated and creative ways to indicate our choices. We can look at this as a progression by way of several examples.


Two-dimensional motions of a mouse or trackball can be translated into the motion of a cursor on a computer screen and we can indicate our choice by “clicking” it when it reaches the desired virtual location. “Dragging” the cursor with the mouse can allow us to select a specific 2-dimensional area of the screen, which in turn requires additional processing of the “input” data. Using the conductive properties of our skin to open a menu or select an option directly on a touchscreen brings us one step closer by making the screen its own input device, bypassing the keyboard and mouse but again adding even more sophistication to the processing required. Having the computer sense the motion and direction of our eyes and use it to determine the exact spot on the screen that is the focus of our gaze and to open the corresponding menu is an example of the next step. Any parameter that can be sensed and manipulated can be used in this interaction.


In order to complete the “interface” there must be a way for us to verify that our intervention has had its desired effect. Most often this is accomplished by some change in the image on the computer screen, but this too should be thought of as a progression of possibilities. The response we perceive may be more creative, such as a sound we can hear, a slight vibration on our wrist under our “smartwatch” or the visible motions of an external device such as a prosthetic limb. This response is known as “feedback” and can involve any stimulus or parameter that we can perceive. The relationship is called a “feedback loop” and it is at the core of how we as organisms learn to manipulate our environment. Even neurons cultured in a dish can “learn” using a feedback loop. These same interactions in the physical world allow us to create and use tools. The potentially limitless capacity of our computers to process data makes them unique among our tools and allows us the immense freedom to create almost any interface that we can imagine (if you would like to read more about the unique status of our computers as tools, see “The Changing Roles of our Desktops, Laptops, Tablets and Smartphones–Part 3,” also from Laptop Repair Hawaii: MobileREMEDIES®).
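The feedback loop itself can be sketched in a few lines (an idealized toy of our own, with a numeric “cursor” standing in for any perceived effect): act, perceive the result, compare it with the intention, correct, and repeat.

```python
def feedback_loop(target, steps=50, gain=0.5):
    """Minimal feedback loop: act, perceive the effect, correct, repeat."""
    position = 0.0                    # e.g. a cursor driven by any input signal
    for _ in range(steps):
        perceived = position          # feedback: we observe our last action's effect
        error = target - perceived    # compare intention with perception...
        position += gain * error      # ...and correct on the next cycle
    return position

feedback_loop(10.0)    # converges on the intended value, 10.0
```

Whether the correction is made by a hand on a mouse, an eye movement or a change in brain waves, the cycle of act–perceive–correct is the same.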

This “evolutionary” process, building progressively more complex input and output algorithms, opens up a realm of possibilities so vast that it can only end with our thoughts themselves becoming the signals! The only stipulation in this reasoning is that there must be some measurable and reproducible indicator of our thoughts. The complexity of the indicator itself is only a temporary barrier.

Historical Background


An English scientist, Richard Caton, first discovered in 1875 that electrical impulses were emitted from the brains of rabbits and monkeys and published a study entitled “The Electric Currents of the Brain”. His methods were mostly invasive, using electrodes placed on or in the cerebral cortex, and so generated little interest, except among physiologists and science fiction writers, until Hans Berger, a German psychiatrist, found in 1924 that he could reliably record “brain waves” non-invasively by placing electrodes on the scalps of humans. He called his technique “electroencephalography” or EEG. He was able to identify different patterns or “rhythms” that were present in normal brains in different states of consciousness and in response to various stimuli and to describe changes that occurred in some brain conditions such as epilepsy.


The EEG gradually became incorporated into modern medicine, but its uses were always somewhat limited by the difficulty of measuring and interpreting these tiny, complex waveforms with the existing technology. Classically, this required extensive preparation: abrading the most superficial layer of skin at each measurement site and attaching multiple electrodes between strands of hair all over the scalp using gels, pastes and straps. Measured from the skin, the signals were only 10 to 100 microvolts and had to be amplified 1,000 to 100,000 times to be usable. Background noise and electromagnetic interference, micro-motions of the scalp from blinking and inadvertent facial expressions, and irregularities at the skin/electrode contact sites often generated signals of similar or greater amplitude, making it difficult to obtain reliable or reproducible results.
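To put those amplification figures in perspective, here is the quick arithmetic behind them (the pairings of signal size and gain are just illustrative endpoints of the ranges above):

```python
import math

# Quick arithmetic behind the figures above: scalp EEG signals of
# 10-100 microvolts amplified 1,000-100,000x, with the same gains
# expressed in decibels. The specific pairings are illustrative.

for microvolts, gain in [(10, 100_000), (100, 1_000)]:
    out_volts = microvolts * 1e-6 * gain
    gain_db = 20 * math.log10(gain)   # voltage gain in decibels
    print(f"{microvolts} uV x {gain:,} = {out_volts:.3f} V ({gain_db:.0f} dB)")
```

A 100 dB voltage gain is enormous, which is exactly why stray interference of similar amplitude at the electrode was such a problem: it gets amplified right along with the brain signal.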


To complicate things further, the blood, membranes and cerebrospinal fluid in and around the brain, as well as the bones of the skull and the tissue of the scalp, “smear” and attenuate the signals, averaging the output of thousands or millions of neurons and giving the EEG poor spatial resolution (the ability to distinguish where in the brain a signal originated). And since signal strength falls off steeply with distance, activity from the deeper tissues below the cortex never even makes it to the scalp. The waveforms themselves are extremely complex, and even experienced electroencephalographers had difficulty interpreting results.


It’s not surprising, then, that the EEG was slow to develop as a candidate for HCI over many decades and remained a “medical” phenomenon. Early BCI research centered almost exclusively on helping patients with devastating brain or spinal cord injuries or diseases to restore some basic function and aid in activities of daily living. This has rightfully continued to be a main area of development. Though the thrust of this article is to present BCI as a logical progression of the human-computer interface for our convenience, I strongly recommend that you take a few minutes to see it in another context and appreciate what a tool it represents for people whose means of interaction are severely limited (as in this woman, who had been unable to feed herself for more than a decade: https://www.youtube.com/watch?v=QRt8QCx3BCo).

One of the pioneers of this research, and the first to use the term “Brain-Computer Interface,” was Professor Jacques Vidal, an electrical and nuclear engineer at the Brain Research Institute at UCLA, in the early 1970s. He wrote of the EEG: “Can these observable electrical brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships? Even on the sole basis of the present states of the art of computer science and neurophysiology, one may suggest that such a feat is potentially around the corner.”


The concept was visionary in 1973. His efforts rapidly attracted the interest and funding of the Defense Advanced Research Projects Agency (DARPA), a government agency founded in the late 1950s to help the US win the “space race.” DARPA was interested in helping severely wounded Vietnam War veterans and in improving communication and training in military settings. Desktop computing was still in its infancy, and large mainframe machines were required. The problems with voltage measurements from the scalp were still formidable, and truly reliable signals required implantation of electrodes on or within the cortex. The more deeply implanted electrodes caused other brain damage and usually became non-functional over time due to scar tissue. Such surgeries could only be justified in people with few other communication options, and the costs were astronomical and prohibitive outside of the research setting.


So far, we have only discussed the EEG and have implied that these signals are the ONLY measurable indicators of brain activity. There are in fact several other known indicators that potentially give much more information about our thought processes because they offer a high degree of spatial resolution. These include (among others) magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). The problem is that measuring these parameters requires multimillion-dollar machines, dedicated laboratories and highly skilled personnel. Oscillations in the pupil size of the eye have recently been used to indicate interest in various contextual images, but it is too soon to know whether this will be useful for more than simple item selection. Other measurable indicators will almost certainly come to light in the future, but for now brain waves are the only practical and affordable game in town!


So, at least for the time being, a reliable and simple surface measurement of the EEG is critical to making BCI an area of mainstream research. While the problem is far from solved, major advances in just the last few years have brought it into the realm of practicality.


The main issues have been establishing and maintaining reliable conductive contact with the skin on a hairy surface without messy gels and pastes, and accommodating the many different shapes and sizes of people’s heads in a comfortable and “non-overwhelming” way. “Dry” electrodes of many different types are being developed and tested; one of the most promising is a soft conductive polymer pad surrounded by a conductive fabric that establishes uniform surface contact by surrounding and pushing past individual hairs. Meanwhile, modern materials with just the right amount of flexibility are being used to fabricate stylish, adjustable and comfortable headgear.

Data processing has also changed since the 1970s! Our computers have become exponentially more powerful, progressively smaller and more portable, and above all cheaper and more accessible (if you are interested in the evolution of our computers see: “The Changing Roles of our Desktops, Laptops, Tablets and Smartphones,” also from Laptop Repair Hawaii: MobileREMEDIES®). These more sophisticated devices, combined with wireless interconnectivity, can measure, quantify and cancel background noise and interference as well as analyze and identify complex patterns within signals. They have helped neuroscientists to better understand and interpret not only brain waves but many other underlying brain functions, including plasticity in certain brain segments and even the spontaneous wiring and communication of individual neurons in cell cultures (if you would like to glimpse a fascinating topic related to BCI see: Robot with a Biological Brain)! In short, our technological advances have allowed us to turn “the corner” that Vidal spoke of in 1973 and bring BCI into practical reality.
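To give a small taste of the kind of pattern analysis that cheap computing power now makes routine, here is a toy sketch (the signal, sample rate and noise level are my own assumptions, not real EEG data): a 10 Hz “alpha-like” oscillation is buried in noise, and a discrete Fourier transform picks it out.

```python
import math
import random

# Illustrative sketch: finding the dominant rhythm in a noisy signal
# with a discrete Fourier transform. The synthetic "EEG" below is a
# 10 Hz alpha-like sine wave buried in Gaussian noise -- all values
# are assumptions for the example, not real recordings.

fs = 128                  # samples per second (assumed)
n = fs * 2                # two seconds of "recording"
random.seed(0)            # deterministic noise for reproducibility

signal = [math.sin(2 * math.pi * 10 * t / fs) + random.gauss(0, 0.5)
          for t in range(n)]

def dft_magnitude(x, k):
    """Magnitude of the k-th DFT bin of x."""
    re = sum(v * math.cos(2 * math.pi * k * t / len(x)) for t, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * k * t / len(x)) for t, v in enumerate(x))
    return math.hypot(re, im)

# Search bins up to the Nyquist frequency and report the strongest one.
peak_bin = max(range(1, n // 2), key=lambda k: dft_magnitude(signal, k))
peak_hz = peak_bin * fs / n
print(peak_hz)  # prints 10.0 -- the buried rhythm stands out despite the noise
```

Real EEG software uses far more sophisticated methods (and fast Fourier transforms rather than this brute-force version), but the principle of pulling a known rhythm out of noisy scalp voltages is the same.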


At MobileREMEDIES®, with locations on Maui and Oahu and mail-in service from anywhere in the world, you get a free diagnostic evaluation and an estimate of the repair costs and time required. You also always get a 1-year warranty on parts and service. If they can’t fix your device, you pay nothing for the attempt! In addition to laptops and desktop computers, they also repair cell phones, iPads and all other tablet PCs, as well as iPods and game systems (Xbox, PlayStation, Wii, etc.). They build custom computers for gamers and other high-demand users, recover lost data, provide web services for individuals and small businesses, buy broken devices for cash or in-store credit and sell refurbished devices with a 1-year warranty, similar to a manufacturer’s warranty on a new device. You can find them at Laptop Repair Hawaii, iPhone repair Hawaii, iPad repair Hawaii, iPod repair Hawaii, Data Recovery Hawaii, Custom Computers Hawaii and Xbox repair Hawaii. You may also go to www.mobileremedies.com or call 1-800-867-5048.
In Part 2 we will look at some of the feats already being achieved by BCI in the current state of the art and look ahead to some projections for the future.

(Click here to read Part 2)