Risks Digest 10.52


From: RISKS Forum <[email protected]>
Date: Wed, 17 Oct 1990 14:18:51 PDT
Subject: RISKS DIGEST 10.52
To: ;@risks-list.ncsl.nist.gov

RISKS-LIST: RISKS-FORUM Digest Wednesday 17 October 1990 Volume 10 : Issue 52

FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
Re: "Pilot error" and Human Factors (P.F. Spelt)
Be careful of what you give away! (M. Freeman)
Re: Technophilia-induced problem at Educom? (Benjamin Ellsworth)
Passwords and chess (Steve Bellovin)
"Expert Systems in the Loop" explained (Martyn Thomas)

The RISKS Forum is moderated. Contributions should be relevant, sound, in good
taste, objective, coherent, concise, and nonrepetitious. Diversity is welcome.
CONTRIBUTIONS to [email protected], with relevant, substantive "Subject:" line
(otherwise they may be ignored). REQUESTS to [email protected].
TO FTP VOL i ISSUE j: ftp CRVAX.sri.com<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>; j is TWO digits. Vol summaries in
risks-i.00 (j=0); "dir risks-*.*<CR>" gives directory; bye logs out.
ALL CONTRIBUTIONS ARE CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
The most relevant contributions may appear in the RISKS section of regular
issues of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Wed, 17 Oct 90 09:50:46 EDT
From: SPELT P F <[email protected]>
Subject: Re: "Pilot error" and Human Factors

In his article posted in RISKS forum of 15th October 1990, Robert
Dorsett made a comment about the A320/human interface which triggered a
"respond NOW" action in me. I am a psychologist working at ORNL in
human factors -- the study of the way people and their machinery (for
work or play) interact. Dorsett said:

>I suggest (again) that the way the airplane interacts with the pilot is
>at LEAST as important as component-wise reliability.

I say: YOU BET!!! My work in human factors (HF) for various projects,
some involving computerized interfaces, some not, has yielded various
comments. The worst kind is: "HF is just common sense." Oh, yeah?
Then why have we had SO MANY instances of poorly designed devices
creating "human error", aka "pilot error" in the cases of the A320 and
other aircraft crashes? Another major problem, also suggested in
Dorsett's posting, is the use of HF consultation. The prevailing modus
operandi has traditionally been to design the system, call in the HF
consultants for evaluation, then have them design a training program to
"train around" the problems designed into the system. Such training
will "work" adequately until a major off-normal event (like TMI), when
the operator is unable to react properly to (interact properly with?)
the mis-designed system.

As we come to design and install more and more complex computerized
interfaces between the machinery and the humans using it, we run the
serious risk of making even greater design errors, many of which will
not show up at all until a major off-normal occurrence comes along. The
introduction of artificial intelligence (AI) into these interfaces adds
an additional dimension along which design errors will propagate. These
concerns have been very adequately covered in the postings on the Aegis
system (Expert System in the Loop postings), although there WAS no ES in
that system.

Several of us at ORNL are involved in research into the use of AI in
"operator associates" for various settings. The potential for using
intelligent computerized interfaces is already being explored in a variety of
settings, but many issues remain to be settled, as the Aegis discussion
has highlighted. These issues need BASIC research directed to answer
the questions raised. In this era of increasingly tight budgets,
however, finding support for that basic research is very difficult.
However, if we don't address these issues, we will continue to see a growing
number of "operator error" accidents analogous to the A320 "pilot error"
crashes.

The usual disclaimers apply: These opinions are my own, and do not necessarily
reflect those of ORNL, the Department of Energy, or Martin Marietta Energy
Systems.
email: [email protected]@UMCGATE@OAX
Phil Spelt bldg 6025, ms 6364 POBox 2008 Oak Ridge, TN 37831-6364

------------------------------

Date: 17 Oct 90 10:33:07 EDT
From: <[email protected].com>
Subject: Be careful of what you give away!

>From CompuServe's Online Today Forum Data Libraries:

MONITOR MONTH IN REVIEW
September 1990

FEDS SEIZE COMPUTERS IN KY. TOWN (Sept. 2): Federal agents over the
weekend seized computer equipment from a Nancy, Ky., business office
when it was learned that the computers might contain secret
government files. The owner of Challenger Ltd., Charles Hayes, said
federal marshals came 70 miles from the US attorney's office in
Lexington, Ky., to seize nine computer terminals, a computer memory
device and other equipment which were purchased from the government
for $45.

This shows a RISK in disposing of computer equipment. Make sure you are only
getting rid of the equipment, and not giving away copies of your data! A tape
bulk-eraser probably does a nice job on old tapes and hard drives.
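Even overwriting the data before disposal beats handing it over intact. A
minimal sketch of that idea in Python follows; the function name and pass
count are illustrative assumptions, and this is no substitute for degaussing
or physical destruction:

```python
import os
import tempfile

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random bytes, then delete it.
    Illustrative only: journaling filesystems, bad-block remapping,
    and backups can keep copies this never touches; for magnetic
    media, a bulk eraser or degausser is the safer bet."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)

# Demo on a throwaway file standing in for a surplus machine's data.
fd, path = tempfile.mkstemp()
os.write(fd, b"secret government files")
os.close(fd)
overwrite_file(path)
assert not os.path.exists(path)
```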

Mark Freeman Microcomputer Technology Specialist/Analyst
CompuServe [email protected].COM

------------------------------

Date: Wed, 17 Oct 90 10:02:19 pdt
From: Benjamin Ellsworth <[email protected]>
Subject: Re: Technophilia-induced problem at Educom?

> The system must have used some kind of voice-recognition algorithm,
> because no human typist that I know could have kept up with the
> speaker at times.

I very strongly doubt this. I would bet a substantial sum of money
that there was a stenographer and not a computer capturing the words.

> The weakness of the voice-recognition system was made painfully
> obvious...

There is a RISK in assuming all failures are technologically induced. It
could very well be that the stenographer hired was simply not very
good. The good ones are expensive, and to do "real-time" stenography
takes a good stenographer.

There is a plausible explanation involving computer RISKs however. The
translation from the steno notation to full English words was in all
likelihood automated. In stenography there are a number of dialects
(usually called theories). Some dialects, especially the older ones,
are not particularly well suited to machine translation. There are also
more than a few translation programs. Between stenographic dialects
and computer translators there can be a significant compatibility
problem. It could be that the stenographer was extremely capable in
the courtroom (where the translations are done off-line by a human),
while at the same time using a style/dialect/theory which was
incompatible with the machine translator.
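The failure mode is easy to see in a toy model. The stroke dictionaries below
are invented for illustration (real steno theories are far larger); the point
is that the same strokes decode differently, or not at all, under a
mismatched dictionary:

```python
# Two hypothetical steno "theories": the same stroke can decode to
# different words, and a stroke missing from the translator's
# dictionary comes out as junk on screen.
DIALECT_A = {"-T": "the", "KAT": "cat", "STKPW": "is"}
DIALECT_B = {"-T": "it", "KAT": "cat"}   # older theory, no STKPW entry

def translate(strokes, dictionary):
    # Unknown strokes fall through untranslated -- the kind of
    # garbage an audience would see with a mismatched translator.
    return " ".join(dictionary.get(s, "<" + s + ">") for s in strokes)

strokes = ["-T", "KAT", "STKPW"]
print(translate(strokes, DIALECT_A))   # the cat is
print(translate(strokes, DIALECT_B))   # it cat <STKPW>
```

A skilled stenographer reading back the same notes off-line would have no
trouble; only the stroke-to-dictionary pairing fails.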

There has been an interesting interaction between technology and court
recording in the last couple of decades. My mother, for instance, is
in the process of re-learning her stenography in a computer compatible
dialect. It reminds me of pilots who have to learn to fly in a
computer compatible way (training around system weaknesses).

Benjamin Ellsworth [email protected] All relevant disclaimers apply.

------------------------------

Date: Tue, 16 Oct 90 22:46:39 EDT
From: [email protected]
Subject: Passwords and chess

Well, since we're talking about chess, here's a tidbit from Saturday's
NY Times, in an article about the Kasparov-Karpov match:

Trying to meet a noon deadline yesterday for invoking the
time-out, Lajos Portisch, a Hungarian grandmaster who is Mr.
Karpov's second, telephoned Geurt Gijssen, a Dutchman who is
chief arbiter of the match, at 11:53 A.M.

How was the arbiter to be sure it really was Mr. Portisch on
the line?

The Hungarian, who had considered a singing career early in
life -- a fact known to some chess experts -- suggested singing
something in his distinctive voice. Mr. Gijssen agreed, and
Mr. Portisch burst forth with several bars of a Hungarian
song.

The arbiter granted the postponement, although the written
request for the time-out arrived late, at 12:07 P.M.

Sounds like they need some sort of challenge/response scheme; that
password is blown...
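For the curious, a challenge/response exchange can be sketched in a few lines;
the shared secret and the HMAC construction here are illustrative assumptions,
not anything the match organizers had:

```python
import hashlib
import hmac
import os

def challenge() -> bytes:
    # Arbiter sends a fresh random nonce; a replayed song can't answer it.
    return os.urandom(16)

def respond(secret: bytes, nonce: bytes) -> bytes:
    # Caller proves possession of the shared secret without revealing it.
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def verify(secret: bytes, nonce: bytes, answer: bytes) -> bool:
    return hmac.compare_digest(respond(secret, nonce), answer)

secret = b"agreed-before-the-match"   # hypothetical pre-shared secret
nonce = challenge()
assert verify(secret, nonce, respond(secret, nonce))
assert not verify(secret, nonce, respond(b"impostor", nonce))
```

Unlike the song, the secret is never spoken on the line, and each nonce is
good for one call only.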

--Steve Bellovin

------------------------------

Date: Wed, 17 Oct 90 18:28:08 +0100
From: Martyn Thomas <[email protected]>
Subject: "Expert Systems in the Loop" explained

[email protected] (Randall Davis) writes:

>As for the title of this whole discussion -- "Expert systems in the loop":

>2) There aren't any and there never were any.
> ... ...
>So until otherwise informed, let's be clear about this: it was a problem of
>"Instruments in the loop". That by itself may be worth discussing, but it is
>not and never was an expert system. And it might be interesting to ask, Why
>the rush to label it an expert system?

The original article was mine, and referred to a report of a new research
project in the UK to develop an expert system to advise commanders in
tactical situations which are too complex to analyse without assistance.

This report *explicitly* referred to an expert system. The point of my
original posting was that an expert system which provides advice, in
circumstances where a decision must be made and there is insufficient time
for the commander to analyse the situation him/herself, is effectively
making the decision. Many who followed up agreed with this viewpoint. I
apologise for mentioning the USS Vincennes - it distracted attention from
the major point, and wasted a lot of net bandwidth. So far as I recall,
no one, throughout the discussion, suggested that Aegis is an expert system.

------------------------------

End of RISKS-FORUM Digest 10.52
************************
