17,021 result(s) for "Computer science History."
The computing universe : a journey through a revolution
"Computers now impact almost every aspect of our lives, from our social interactions to the safety and performance of our cars. How did this happen in such a short time? And this is just the beginning. In this book, Tony Hey and Gyuri Pápay lead us on a journey from the early days of computers in the 1930s to the cutting-edge research of the present day that will shape computing in the coming decades. Along the way, they explain the ideas behind hardware, software, algorithms, Moore's Law, the birth of the personal computer, the Internet and the Web, the Turing Test, Jeopardy's Watson, World of Warcraft, spyware, Google, Facebook, and quantum computing. This book also introduces the fascinating cast of dreamers and inventors who brought these great technological developments into every corner of the modern world. This exciting and accessible introduction will open up the universe of computing to anyone who has ever wondered where his or her smartphone came from" -- Provided by publisher.
It began with Babbage : the genesis of computer science
As a field, computer science occupies a unique scientific space, in that its subject matter can exist in both physical and abstract realms. An artifact such as software is both tangible and not, and must be classified as something in between, or "liminal." The study and production of liminal artifacts allows for creative possibilities that are, and have been, possible only in computer science. In It Began with Babbage, computer scientist and writer Subrata Dasgupta examines the distinct history of computer science in terms of its creative innovations, reaching back to Charles Babbage in 1819. Since all artifacts of computer science are conceived with a use in mind, the computer scientist is not concerned with the natural laws that govern disciplines like physics or chemistry; instead, the field is more concerned with the concept of purpose. This requirement lends itself to a type of creative thinking that, as Dasgupta shows us, has exhibited itself throughout the history of computer science. More than any other, computer science is the science of the artificial, and has a unique history to accompany its unique focus. The book traces a path from Babbage's Difference Engine in the early 19th century to the end of the 1960s, by which time a new academic discipline named "computer science" had come into being. Along the way we meet characters like Babbage and Ada Lovelace, Turing and von Neumann, Shannon and Chomsky, and a host of other people from a variety of backgrounds who collectively created this new science of the artificial. And in the end, we see how and why computer science acquired a nature and history all of its own.
The computer : a very short introduction
Computers have changed so much since the room-filling, bulky, magnetic-tape-running monsters of the mid-20th century. They now form a vital part of most people's lives. And they are more ubiquitous than might be thought - you may have more than 30 computers in your home: not just the desktop and laptop, but think of the television, the fridge, the microwave. But what is the basic nature of the modern computer? How does it work? How has it been possible to squeeze so much power into increasingly small machines? And what will the next generations of computers look like? In this Very Short Introduction, Darrel Ince looks at the basic concepts behind all computers; the changes in hardware and software that allowed computers to become so small and commonplace; the challenges produced by the computer revolution - especially whole new modes of cybercrime and security issues; the Internet and the advent of 'cloud computing'; and the promise of whole new horizons opening up with quantum computing, and even computing using DNA -- Source other than Library of Congress.
The second age of computer science : from ALGOL genes to neural nets
Between the genesis of computer science in the 1960s and the advent of the World Wide Web around 1990, computer science evolved in significant ways. The author has termed this period the "second age of computer science." This book describes its evolution in the form of several interconnected parallel histories.
Fundamental concepts in computer science
This book presents fundamental contributions to computer science as written and recounted by those who made the contributions themselves. As such, it is a highly original approach to a “living history” of the field of computer science. The scope of the book is broad in that it covers all aspects of computer science, going from the theory of computation, the theory of programming, and the theory of computer system performance, all the way to computer hardware and to major numerical applications of computers.
The innovators : how a group of hackers, geniuses, and geeks created the digital revolution
"Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens. What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail? In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. This is the story of how their minds worked and what made them so inventive. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen" -- Provided by publisher.
Computing
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of "smart" hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization - the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by "Moore's Law"; and the human-machine interface. Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word "digital" in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general-purpose computer; and ARPANET, the Internet's precursor. Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a "minicomputer" to a desktop computer to a pocket-sized smartphone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
The Innovators : How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution
Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and a guide to how innovation really works. What talents allowed certain inventors and entrepreneurs to turn their disruptive ideas into realities? What led to their creative leaps? Why did some succeed and others fail? In his exciting saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He then explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee and Larry Page. This is the story of how their minds worked and what made them so creative. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity and teamwork, this book shows how they actually happen.
Meilensteine der Rechentechnik
The origins of computing already lie in obscurity. This book presents selected milestones of computing technology and of the early days of computer science. It draws on, among other sources, sensational finds of devices and documents made in recent years: historical reckoning tables, the world's largest cylindrical slide rule, the world's oldest surviving key-driven adding machine, and previously unknown documents relating to the inventor Zuse. Both analog and digital calculating devices are covered: counting frames, reckoning tables, mechanical calculating machines, slide rules, electronic computers, and more. Numerous tables provide a worldwide overview of the first digital computers. A particular focus is on the German-speaking countries - Germany, Austria, Switzerland, and Liechtenstein - with a comprehensive account of mechanical calculating machines from Switzerland. Timelines give an overview of early American, British, and German automatic calculating machines. The author also addresses the delicate question: Who invented the computer? A multilingual bibliography with over 3,000 entries rounds out the volume. This generally accessible work is aimed at everyone interested in the history of computing technology and computer science.