MindPapers is now part of PhilPapers: online research in philosophy, a new service with many more features.
 Compiled by David Chalmers (Editor) & David Bourget (Assistant Editor), Australian National University.

6.4e. The Nature of AI (The Nature of AI on PhilPapers)

See also:
Buchanan, Bruce G. (1988). AI as an experimental science. In James H. Fetzer (ed.), Aspects of AI. Kluwer.   (Google)
Bundy, A. (1990). What kind of field is AI? In Derek Partridge & Y. Wilks (eds.), The Foundations of Artificial Intelligence: A Sourcebook. Cambridge University Press.   (Cited by 6 | Google)
Dennett, Daniel C. (1978). AI as philosophy and as psychology. In Martin Ringle (ed.), Philosophical Perspectives on Artificial Intelligence. Humanities Press.   (Annotation | Google)
Glymour, C. (1988). AI is philosophy. In James H. Fetzer (ed.), Aspects of AI. Kluwer.   (Cited by 1 | Google)
Harré, Rom (1990). Vygotsky and artificial intelligence: What could cognitive psychology possibly be about? Midwest Studies in Philosophy 15:389-399.   (Google)
Kukla, André (1989). Is AI an empirical science? Analysis 49 (March):56-60.   (Cited by 4 | Annotation | Google)
Kukla, André (1994). Medium AI and experimental science. Philosophical Psychology 7 (4):493-502.   (Cited by 4 | Annotation | Google)
Abstract: It has been claimed that a great deal of AI research is an attempt to discover the empirical laws describing a new type of entity in the world—the artificial computing system. I call this enterprise 'medium AI', since it is in some respects stronger than Searle's 'weak AI', and in other respects weaker than 'strong AI'. Bruce Buchanan, among others, conceives of medium AI as an empirical science entirely on a par with psychology or chemistry. I argue that medium AI is not an empirical science at all. Depending on how artificial computing systems are categorized, it is either an a priori science like mathematics, or a branch of engineering.
McCarthy, John (online). What is artificial intelligence?   (Cited by 38 | Google | More links)
Minsky, Marvin L. (online). From pain to suffering.   (Google)
Abstract: “Great pain urges all animals, and has urged them during endless generations, to make the most violent and diversified efforts to escape from the cause of suffering. Even when a limb or other separate part of the body is hurt, we often see a tendency to shake it, as if to shake off the cause, though this may obviously be impossible.” —Charles Darwin[1]
Nakashima, H. (1999). AI as complex information processing. Minds and Machines 9 (1):57-80.   (Cited by 2 | Google | More links)
Abstract: In this article, I present a software architecture for intelligent agents. The essence of AI is complex information processing. It is impossible, in principle, to process complex information as a whole. We need some partial processing strategy that is still somehow connected to the whole. We also need flexible processing that can adapt to changes in the environment. One of the candidates for both of these is situated reasoning, which makes use of the fact that an agent is in a situation, so it only processes some of the information – the part that is relevant to that situation. The combination of situated reasoning and context reflection leads to the idea of organic programming, which introduces a new building block of programs called a cell. Cells contain situated programs and the combination of cells is controlled by those programs.
Sloman, Aaron (2002). The irrelevance of Turing machines to AI. In Matthias Scheutz (ed.), Computationalism: New Directions. MIT Press.   (Cited by 9 | Google | More links)
Sufka, Kenneth J. & Polger, Thomas W. (2005). Closing the gap on pain. In Murat Aydede (ed.), Pain: New Essays on its Nature and the Methodology of its Study. MIT Press.   (Google | More links)
Abstract: A widely accepted theory holds that emotional experiences occur mainly in a part of the human brain called the amygdala. A different theory asserts that color sensation is located in a small subpart of the visual cortex called V4. If these theories are correct, or even approximately correct, then they are remarkable advances toward a scientific explanation of human conscious experience. Yet even understanding the claims of such theories—much less evaluating them—raises some puzzles. Conscious experience does not present itself as a brain process. Indeed experience seems entirely unlike neural activity. For example, to some people it seems that an exact physical duplicate of you could have different sensations than you do, or could have no sensations at all. If so, then how is it even possible that sensations could turn out to be brain processes?
Yudkowsky, Eliezer (online). General intelligence and seed AI.   (Google)