I'm in need of help with writing a paper for one of my Computer Science courses. I have to recap the details of the case, identify the central stakeholders in the case, identify the technical/professional problem in the case, identify the ethical problem or problems in the case, and solve the technical and ethical problems using both technical and ethical standards. I also have to use 3 out of the 5 rules of computing in my paper; I will attach the rules. The paper is due by 5:30 pm. Help would be greatly appreciated, and of course you'll be compensated for helping :) I will also attach the guidelines for the assignment as well as the case study.
ai_boy.jpg
2nd_case_study_assignment.doc
five_rules_for_computing.pdf
2nd Case Study assignment
Steps in Ethical Case Analysis
1. Get the facts straight. Review the case. Briefly recap the details of the case at the
beginning of your paper.
2. Identify the central stakeholders in the case.
3. Identify the technical/professional problem in the case.
4. Identify the Ethical problem or problems in the case.
5. Solve the technical and ethical problems using both technical and ethical
standards. Analyze the case from 3 perspectives; use 1 ethical principle for each
perspective.
6. In addition, analyze the case from a professional's perspective (CS, IS, or IT).
Use 3 of the 5 rules from Miller et al. in your analysis.
7. Will your solution to the problem withstand criticism from the perspectives of both
a variety of ethical principles and professionals in your field?
8. What recommendations can you make about the problems in the case based upon
your ethical analysis?
When you construct your analysis, be sure to remember that we are assigning a 4-5
page analysis. With this length limitation, it is important to realize that you will
probably only be able to look at the problems in the case from four stakeholder
perspectives. If you try to analyze every stakeholder perspective, you will probably
exceed the length limitation.
Grading and Evaluation of Individual Papers
1. What are the objectives of the papers?
A.
i. To become sensitized to the ethical issues in Engineering and Information
Technology.
ii. Learn how to analyze a case.
iii. Learn how to identify the major stakeholders in a case.
iv. Learn how to identify the technical problems in a case.
v. Learn how to identify the ethical problems in a case.
vi. Learn how to apply ethical principles to a case.
vii. Learn how to make recommendations in a case based upon ethical analysis.
B.
a. Critical thinking
i. Did you identify and focus on the crucial material and facts in the case?
ii. Did you support claims you make about the case with facts?
iii. Did you think about the case from a variety of stakeholder perspectives?
b. Ethical analysis
i. Did you identify the central ethical problem(s)?
ii. Did you conduct a well thought out ethical analysis by applying 3 ethical
principles?
iii. Did you justify ethical judgments with accurate facts and ethical principles?
iv. Did you think about the case from a variety of ethical perspectives?
v. Did you base your analysis upon ethical principles that cannot be easily
criticized?
How will your papers be assessed?
Evaluations of Papers (Percentages)
A. The Case Recap (.10)
i. Do you highlight the key points in the case?
a. Are the facts in the case accurate?
b. Did you focus on the crucial aspects of the case?
c. Did you avoid including nonessential or superfluous information?
B. Stakeholders (.10)
i. Who are the primary and secondary stakeholders in the case?
ii. Did you clearly identify from which stakeholder perspective you are analyzing the
material in the case?
a. From whose perspective are you analyzing the case?
C. What is the central technical problem? (.15)
i. How is the central technical problem related to the ethical problems in the case?
D. What is/are the ethical problem/problems? (.15)
i. What ethical problems do you see in the case?
ii. What is the central ethical problem?
iii. Why is this the central ethical problem?
E. What ethical principles apply to the central ethical problem? (.30)
i. Do you correctly define the ethical principles you use?
ii. Have you correctly applied the ethical principles you use?
iii. Do the principles you use withstand obvious criticisms from other ethical
perspectives?
F. What are your recommendations? (.20)
i. Are your recommendations based upon your ethical analysis?
ii. Do your recommendations link to your ethical analysis?
iii. Rather than stating the obvious (e.g., this problem could have been easily solved if
…), what do you recommend for similar cases in the future?
IT Ethics
Moral Responsibility for Computing Artifacts: “The Rules”

Keith W. Miller, University of Illinois at Springfield
In March 2010, the Poynter Center for the Study of Ethics and American Institutions held a workshop sponsored by the US National Science Foundation. An interdisciplinary group of philosophers, computer scientists, practitioners, and lawyers gathered to discuss “ethical guidance for the research and application of pervasive and autonomous information technology” (http://poynter.indiana.edu/pait).

During the workshop, we started to develop a short statement about the ethics of developing computer systems, and this statement has since evolved into a document about moral responsibility. It’s not a Wiki, but 50 people, including academics and IT professionals, have already contributed to it. An early working title was “Principles Governing Moral Responsibility for Computing Artifacts,” but most of the time, we just call it “The Rules.”
The Rules

Currently, the document includes a preamble, some definitions, five rules, and explanations, though it could change before this column is published. Anyone can volunteer to join the Ad Hoc Committee on Responsible Computing and suggest changes for The Rules by emailing the coordinator. As the current coordinator, it’s my job to circulate new suggestions and incorporate the accepted changes.

Here, I present a condensed version of the latest document, “Moral Responsibility for Computing Artifacts: Five Rules, Version 27.” The reason the document has gone through 27 versions is that the signers have taken a great deal of care with the words in The Rules, so the excerpts here don’t constitute an official version. Please visit https://edocs.uis.edu/kmill2/www/TheRules/moralResponsibilityForComputerArtifactsV27.pdf for the full version, which includes more detailed explanations for the definitions and rules.

1520-9202/11/$26.00 © 2011 IEEE
Preamble

As computing artifacts become increasingly complex, some have suggested that such artifacts greatly complicate issues of responsibility. In order to help deal with these complexities, we propose five rules as a normative guide for people who design, develop, deploy, evaluate, or use computing artifacts. Our aim is to reaffirm the importance of moral responsibility for these artifacts, and to encourage individuals and institutions to carefully examine their own responsibilities with respect to computing artifacts. We do not claim that these rules are exhaustive; professionals, individuals, and organizations may choose to take on more responsibility than we describe here.

Published by the IEEE Computer Society
A Working Definition of “Computing Artifacts.” We use “computing artifact” for any artifact that includes an executing computer program. [This includes] software applications running on a general-purpose computer, programs burned into hardware and embedded in mechanical devices, robots, phones, webbots, toys, and programs distributed across more than one machine…. We [include] software that’s commercial, free, open source, recreational, an academic exercise, or a research tool.
A Working Definition of “Moral Responsibility.” We use “moral responsibility for computing artifacts” to indicate that people are answerable for their behavior when they produce or use computing artifacts, and that their actions reflect on their character.1 “Moral responsibility” includes an obligation to adhere to reasonable standards of behavior, and to respect others who could be affected by the behavior. We do not address legal liability in this document.

computer.org/ITPro
A Working Definition of “Sociotechnical Systems.” Each computing artifact should be understood in the context of “sociotechnical systems.” A sociotechnical system includes people, relationships between people, other artifacts, physical surroundings, customs, assumptions, procedures, and protocols.2

We acknowledge the importance of sociotechnical systems to the issue of moral responsibility for computing artifacts. For example, a GPS navigator is a computing artifact, but in isolation from the satellites it uses to ascertain location, it can’t perform its function….

[Ignoring] the sociotechnical systems in which a computing artifact is embedded is folly, [but] including all relevant sociotechnical systems components in every discussion of moral responsibility involving a computing artifact will make it impractical to assign meaningful responsibility to the people most directly involved with that specific artifact. To negotiate this tension, we first discuss moral responsibility for computing artifacts in a more focused sense (Rules 1, 2, and 3), and then place this discussion into a broader context that explicitly includes sociotechnical systems (Rules 4 and 5).
Rule 1

The people who design, develop, or deploy a computing artifact are morally responsible for that artifact, and for the foreseeable effects of that artifact. This responsibility is shared with other people who design, develop, deploy, or knowingly use the artifact as part of a sociotechnical system.

IT Pro May/June 2011
Rule 2

The shared responsibility of computing artifacts is not a zero-sum game. The responsibility of an individual is not reduced simply because more people become involved in designing, developing, deploying, or using the artifact. Instead, a person’s responsibility includes being answerable for the behaviors of the artifact and for the artifact’s effects after deployment, to the degree to which these effects are reasonably foreseeable by that person.

… By using the word “foreseeable,” we acknowledge that the people who design, develop, deploy, and use artifacts cannot reasonably be expected to foresee all the effects of the artifacts, for all time. However, implicit in our use of this word is the expectation that people make a good faith effort to predict the uses, misuses, and effects of the deployment, and to monitor these after deployment. Willful ignorance, or cursory thought, is not sufficient to meet the ethical challenges of Rules 1 and 2….
Rule 3

People who knowingly use a particular computing artifact are morally responsible for that use.

The word “knowingly” is problematic in Rule 3, but we think it is, on balance, appropriate. People who “use” a particular computing artifact might not be aware of this use. For example, a driver might not have any knowledge of a computing artifact embedded in the car that records data for analysis in case of a crash. It seems counterintuitive to us to assign moral responsibility to the driver for the use of that artifact.

However, when someone knowingly and intentionally uses a particular computing artifact, that person takes on moral responsibility attached to that use. A dramatic example is when someone launches a cruise missile at an enemy target; a more mundane example is when someone searches the Web for information about a prospective employee. The moral responsibility of a user includes an obligation to learn enough about the computing artifact’s effect to make an informed judgment about its use for a particular application.

It is not our intent to absolve the users of computing artifacts from moral responsibility if they are willfully ignorant about artifacts or their effects….
Rule 4

People who knowingly design, develop, deploy, or use a computing artifact can do so responsibly only when they make a reasonable effort to take into account the sociotechnical systems in which the artifact is embedded.

Sociotechnical systems are increasingly powerful. If people thoughtlessly produce and adopt these systems, they are, in our opinion, being morally irresponsible. Ignorance is not a justification for harms associated with sociotechnical systems and the computing artifacts embedded in those systems. Security issues that occur when computing artifacts are deployed via the Internet are an example of the interaction of an artifact and a sociotechnical system.

Rule 4 is intended to be a progressively heavy burden. It requires an honest effort to identify and understand relevant systems, commensurate with one’s ability and depth of involvement with the artifact and system. Thus, the burden is heavier for those with more expertise and more influence over the artifact’s effects and over the system’s effects. Those in design and development cannot shift their burden to the users (see Rule 2), and users cannot shift the burden to developers when users’ local knowledge is critical to appropriate ethical action….
Rule 5

People who design, develop, deploy, promote, or evaluate a computing artifact should not explicitly or implicitly deceive users about the artifact or its foreseeable effects, or about the sociotechnical systems in which the artifact is embedded.

Morally responsible use of computing artifacts and sociotechnical systems requires reliable information about the artifacts and systems. People who design, develop, deploy, or promote a computing artifact should provide honest, reliable, and understandable information about the artifact, its effects, possible misuses, and, to the extent foreseeable, about the sociotechnical systems in which they think the artifact will be embedded….
Computing Artifacts that Are Not Exceptions to the Rules

No matter how sophisticated computing artifacts become, the rules still apply. For example, if an artifact uses a neural net, and the designers subsequently are surprised by the artifact’s effects, the rules hold. If a computing artifact is self-modifying and eventually becomes quite different from the original artifact, the rules still hold. If a computing artifact is a distributed system or an emerging system, the rules still hold for the people associated with the pieces that are distributed, for the people associated with the organization of the overall system, and for the people responsible for the system from which the new system emerged….
A New Community

On 4 March 2011, at the annual meeting of the Association for Practical and Professional Ethics, a panel of people who helped write The Rules discussed what the document means.

Michael Davis (Illinois Institute of Technology) discussed both similarities and differences between The Rules and codes of ethics. For example, the subject matters are similar, but codes of ethics are usually aimed at organizations; The Rules are aimed at people from different professions. Organizations adopt codes of ethics, but individuals sign up for The Rules.

Chuck Huff (St. Olaf College) compared The Rules to a prophetic voice. Although most of the five rules are stated as explanations, Huff views them as inspirational, not just descriptive. The word “should” appears explicitly in only one rule, but the document itself is in the spirit of “should.” By signing up for The Rules, people are embracing the responsibilities given there. The Rules can be viewed as an attempt both to challenge computing professionals and users to embrace their responsibilities and to support those who do.

Ken Pimple (Poynter Institute) emphasized that a community has begun to form around The Rules through the document’s cooperative development. The community already contains academics and practitioners, computer scientists and philosophers, and people from nine different countries, united by their willingness to publicly assert their support for The Rules.
I think people are interested in The Rules in part because the pace of technological change and global reach of computing and telecommunications systems are unsettling. We’re hungry for more clarity about who is responsible for what in these increasingly important sociotechnical systems, and The Rules are one attempt to reason together about these difficult issues.

I hope you’re sufficiently intrigued to read The Rules in their entirety (available at https://edocs.uis.edu/kmill2/www/TheRules), and I invite you to get involved. If you like The Rules, you can sign on by emailing me at miller.keith@uis.edu. If you don’t like The Rules, you can also get involved by suggesting changes. Perhaps the next version of The Rules will include some of your ideas about our moral responsibility in designing, developing, and employing computing artifacts.
References

1. M. Davis, “‘Ain’t No One Here But Us Social Forces’: Constructing the Professional Responsibility of Engineers,” to be published in Science and Engineering Ethics; www.springerlink.com/content/33338u607x251074/.
2. C. Huff, “Why a Sociotechnical System?” ComputingCases.org, http://computingcases.org/general_tools/sia/socio_tech_system.html.
3. H. Nissenbaum, “Computing and Accountability,” Comm. ACM, Jan. 1994, pp. 72–80.
Keith W. Miller is the Schewe Professor in Liberal Arts and Sciences at the University of Illinois at Springfield. His research areas are computer ethics and software testing. Contact him at miller.keith@uis.edu.
…