<?xml version="1.0" encoding="UTF-8" ?>
<modsCollection xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.loc.gov/mods/v3" xmlns:slims="http://slims.web.id" xsi:schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-3.xsd">
<mods version="3.3" id="21455">
 <titleInfo>
  <title>AI Ethics (The MIT Press Essential Knowledge series)</title>
 </titleInfo>
 <name type="personal">
  <namePart>Mark Coeckelbergh</namePart>
  <role>
   <roleTerm type="text" authority="marcrelator">Author</roleTerm>
  </role>
 </name>
 <typeOfResource manuscript="no" collection="no">text</typeOfResource>
 <genre authority="marcgt">book</genre>
 <originInfo>
  <place>
   <placeTerm type="text">Islamabad</placeTerm>
  </place>
  <publisher>Al Hafiz Traders</publisher>
  <dateIssued>2020</dateIssued>
 </originInfo>
 <language>
  <languageTerm type="code" authority="iso639-2b">eng</languageTerm>
  <languageTerm type="text">English</languageTerm>
 </language>
 <physicalDescription>
  <form authority="gmd">Book</form>
  <extent>435 pages</extent>
 </physicalDescription>
 <abstract>Artificial intelligence powers Google's search engine, enables Facebook to target advertising, and allows Alexa and Siri to do their jobs. AI is also behind self-driving cars, predictive policing, and autonomous weapons that can kill without human intervention. These and other AI applications raise complex ethical issues that are the subject of ongoing debate. This volume in the MIT Press Essential Knowledge series offers an accessible synthesis of these issues. Written by a philosopher of technology, AI Ethics goes beyond the usual hype and nightmare scenarios to address concrete questions.&#13;
 &#13;
Mark Coeckelbergh describes influential AI narratives, ranging from Frankenstein's monster to transhumanism and the technological singularity. He surveys relevant philosophical discussions: questions about the fundamental differences between humans and machines and debates over the moral status of AI. He explains the technology of AI, describing different approaches and focusing on machine learning and data science. He offers an overview of important ethical issues, including privacy concerns, responsibility and the delegation of decision making, transparency, and bias as it arises at all stages of data science processes. He also considers the future of work in an AI economy. Finally, he analyzes a range of policy proposals and discusses challenges for policymakers. He argues for ethical practices that embed values in design, translate democratic values into practices, and include a vision of the good life and the good society.</abstract>
 <note type="statement of responsibility">Mark Coeckelbergh</note>
 <subject>
  <topic>Computer Science</topic>
 </subject>
 <subject>
  <topic>Software Engineering</topic>
 </subject>
 <subject>
  <topic>Data Science</topic>
 </subject>
 <subject>
  <topic>Information Technology</topic>
 </subject>
 <classification>170</classification>
 <identifier type="isbn">9780262357074</identifier>
 <location>
  <physicalLocation>NUML Library Rawalpindi (National University of Modern Languages). The library's state-of-the-art Research Facilitation Centre is equipped with computers that give readers access to a digital library of more than 23,000 research journals and 130,000 online books, and to the E-Library of NUML Rawalpindi.</physicalLocation>
  <shelfLocator>170</shelfLocator>
  <holdingSimple>
   <copyInformation>
    <numerationAndChronology type="1">14203</numerationAndChronology>
   </copyInformation>
  </holdingSimple>
 </location>
 <slims:image>14203.jpg</slims:image>
 <recordInfo>
  <recordIdentifier>21455</recordIdentifier>
  <recordCreationDate encoding="w3cdtf">2025-09-04T18:32:53</recordCreationDate>
  <recordChangeDate encoding="w3cdtf">2026-02-03T16:57:00</recordChangeDate>
  <recordOrigin>machine generated</recordOrigin>
 </recordInfo>
</mods>
</modsCollection>