amorphous computing: Amorphous computing is inspired by the recent astonishing developments in molecular biology and in microfabrication. Each of these is the basis of a kernel technology that makes it possible to build or grow huge numbers of almost-identical information-processing units, with integral actuators and sensors (e.g. MEMS), at almost no cost. Microelectronic components are so inexpensive that we can imagine mixing them into materials that are produced in bulk, such as paints, gels, and concrete. Such "smart materials" will be used in structural elements and in surface coatings, such as skins or paints. [Harold Abelson, Thomas F. Knight, Gerald Jay Sussman, and friends, Amorphous Computing Manifesto, MIT, 1996] http://www.swiss.ai.mit.edu/projects/amorphous/white-paper/amorph-new/amorph-new.html
Amorphous Computing Homepage, Artificial Intelligence, MIT, US http://www.swiss.ai.mit.edu/projects/amorphous/
"Amorphous computing" Communications of the ACM, May 2000 http://www.swiss.ai.mit.edu/projects/amorphous/cacm-2000.html
autonomic computing: An approach to self-managed computing systems with a minimum of human interference. The term derives from the body's autonomic nervous system, which controls key functions without conscious awareness or involvement. [IBM Corp, Autonomic Computing Glossary] http://www.research.ibm.com/autonomic/glossary.html
Beowulf computing: A method of ganging lots of Linux-based computers together to tackle heavy-duty calculation jobs. It's sort of a do-it-yourself, low-budget supercomputer that's proven popular at universities and government labs. Now, the broadening appeal of the technique has led companies such as IBM and Compaq Computer to see Beowulf as a possible product to add to the computing lineup. The move parallels the adoption of the Linux operating system in other parts of the corporate world. [Stephen Shankland "Beowulf Computing Method Makes Business Inroads" Cnet News.com Aug. 5, 1999] http://news.cnet.com/news/0,10000,0-1003-200-345757,00.html
captology: The study of computers as persuasive technologies. This includes the design, research, and analysis of interactive computing products created for the purpose of changing people's attitudes or behaviors. Key Concepts: Computers as Persuasive Technologies, Stanford Univ. Persuasive Technologies Lab, US http://captology.stanford.edu/
computation: Has become an essential component of scientific research. The great quantity and diversity of the data being generated by different technologies is daunting, and impossible to organize or oversee without computational assistance. Without effective and integrated databases to store and retrieve these data and advanced computational methods such as pattern recognition and other machine learning approaches to analyze and interpret them, the full implications of these data will not be realized.
computational linguistics: Information management & interpretation glossary
computational video: The study and application of the processing of streamed video data. This field of research is emerging from the convergence of digital cameras with high-performance computing and high-bandwidth networks. In addition, past and current research in machine vision has provided practical solutions to some of the fundamental problems inherent in processing video. [Institute for Information Technology, National Research Council, Canada, Research Programs Computational Video] http://iit-iti.nrc-cnrc.gc.ca/templates/itiiit/itiiit2.cfm?CFID=33974&CFTOKEN=93356...
compute farm: Related terms: compute server farm, ranch, server farm.
computer virus: Glossary of terms, McAfee, 100+ definitions, 2002 http://www.mcafee.com/anti-virus/virus_glossary.asp
computers: Narrower terms include high performance computers, supercomputers. Related terms include ASP Active Server Pages, compute farm, informatics, MPP Massively Parallel Processing, parallel processing, petaflop, teraflop, server
computing: Narrower terms: DCE Distributed Computing Environment, DNA computing, grid computing, high performance computing, molecular computing, quantum computing, utility computing
database: One or more large structured sets of persistent data, usually associated with software to update and query the data. A simple database might be a single file containing many records, each of which contains the same set of fields where each field is a certain fixed width. A database is one component of a database management system.
database management system: (DBMS) A suite of programs which typically manage large structured sets of persistent data, offering ad hoc query facilities to many users. They are widely used in business applications. A database management system (DBMS) can be an extremely complex set of software programs that controls the organisation, storage and retrieval of data (fields, records and files) in a database. It also controls the security and integrity of the database. The DBMS accepts requests for data from the application program and instructs the operating system to transfer the appropriate data.
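The request-and-retrieve pattern described above can be sketched with an embedded DBMS; here Python's bundled sqlite3 module stands in for a full DBMS, and the table and records are invented for illustration:

```python
import sqlite3

# An in-memory database: the DBMS handles organisation, storage and retrieval.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genes (symbol TEXT PRIMARY KEY, organism TEXT)")
conn.executemany("INSERT INTO genes VALUES (?, ?)",
                 [("TP53", "human"), ("BRCA1", "human"), ("lacZ", "E. coli")])

# An ad hoc query: the application requests data; the DBMS locates and returns it.
rows = conn.execute(
    "SELECT symbol FROM genes WHERE organism = ? ORDER BY symbol",
    ("human",)).fetchall()
print([r[0] for r in rows])  # ['BRCA1', 'TP53']
```

The application never touches the files itself; it only issues requests, which is the separation the definition describes.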
data mapping: The process of assigning a source data element to a target data element. [Glossary, DM Review] http://www.dmreview.com/master.cfm?NavID=32&KeywordID=a
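As a minimal sketch of that process (all field names are hypothetical), a source-to-target mapping can be expressed as a lookup table applied to each record:

```python
# Hypothetical mapping of source data elements to target data elements.
FIELD_MAP = {"cust_nm": "customer_name",
             "addr_1": "street_address",
             "zip": "postal_code"}

def map_record(source: dict) -> dict:
    """Assign each source data element to its target data element."""
    return {FIELD_MAP[k]: v for k, v in source.items() if k in FIELD_MAP}

print(map_record({"cust_nm": "Acme", "zip": "02139", "ignored": "x"}))
# {'customer_name': 'Acme', 'postal_code': '02139'}
```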
discretization strategies: See under mesh interface
Distributed Computing Environment DCE: From the Open Software Foundation. (The Open Software Foundation is now called the Open Group.) DCE consists of multiple components which have been integrated to work closely together. ... DCE is called "middleware" or "enabling technology." It is not intended to exist alone, but instead should be bundled into a vendor's operating system offering, or integrated in by a third-party vendor. DCE's security and distributed filesystem, for example, can completely replace their current, non-network, analogs. DCE is not an application in itself, but is used to build custom applications or to support purchased applications. [Open Software Foundation Distributed Computing Environment FAQ, Oct. 1998] http://www.faqs.org/faqs/dce/faq/
Related terms: CORBA, OMG; Nanoscience & miniaturization glossary nanocomputer
evolutionary computation: Algorithms & data analysis glossary
Narrower terms: Algorithms & data analysis glossary genetic algorithms, genetic programming
FLOP: Floating point operations per second. A measure of how fast a computer is, based on calculations per second. A floating point is a number representation consisting of a mantissa, an exponent, and an assumed radix. The number represented is M multiplied by R raised to the power of E (M*R^E) where R is the radix or base of the number system. (For example, 10 is the radix of the decimal system.) [National Center for Supercomputing Applications, MetaComputer Glossary, Univ. of Illinois, Urbana-Champaign 1995] http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html
Related terms: petaflop, teraflop
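The M*R^E formula above can be checked in a few lines of Python; this is a sketch of the representation itself, not of any particular hardware floating-point format:

```python
def fp_value(mantissa: float, exponent: int, radix: int = 2) -> float:
    """Value of a floating-point number represented as M * R**E."""
    return mantissa * radix ** exponent

print(fp_value(1.5, 3))         # 12.0  (1.5 * 2**3, binary radix)
print(fp_value(6.022, 23, 10))  # about 6.022e+23, decimal radix
```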
filetypes: Extensions after a filename. About 300 explained, including mp3 and txt, at http://www.computeruser.com/resources/dictionary/filetypes.htm
free software: Software in which the source code is by definition freely available to the general public for redistribution, modification, examination or any other conceivable purpose. Similar to "open-source software," except that "open-source" is a relatively recent term coined for marketing purposes by people who wanted to put a more "business-friendly" face on the concept. Free software, as an idea, is usually associated with Richard Stallman, founder of the Free Software Foundation. [Andrew Leonard, The Free Software Project, Free software glossary, Salon.com, 2000] http://cobrand.salon.com/tech/fsp/glossary/
GUI Graphical User Interface: The two most useful GUIs are the Query interface to the database and the Report/Analysis interfaces … each interface should do only a handful of tasks. The most common mistake is to keep adding functionality to an interface rather than creating a new interface… The Query tools are just starting to emerge… The unmet need of querying data is in the area of joining internal and external data… The other emerging concept for the query tool is the need for relationships between the data. The most critical need is to have consistent terms used to describe the data. It is often very difficult to get scientists across a multi-site company to agree on one. The other complication is legacy data, either having the wrong terms or no terms at all. [Frank Brown "Chemoinformatics: What is it and How does it Impact Drug Discovery" Annual Reports in Medicinal Chemistry 33: 375-384, 1998]
Related terms: Information management & interpretation glossary ontologies, taxonomies
geek: A definition http://samsara.circus.com/~omni/geek.html
genetic programming: Algorithms & data analysis glossary
geographical information system: (GIS) A computer system for capturing, storing, checking, integrating, manipulating, analysing and displaying data related to positions on the Earth's surface. Typically, a GIS is used for handling maps of one kind or another. These might be represented as several different layers where each layer holds data about a particular kind of feature (e.g. roads). Each feature is linked to a position on the graphical image of a map. Layers of data are organised to be studied and to perform statistical analysis (i.e. a layer of customer locations could include fields for Name, Address, Contact, Number, Area). Uses are primarily government related, town planning, local authority and public utility management, environmental, resource management, engineering, business, marketing, and distribution.
GIS Dictionary http://www.geo.ed.ac.uk/root/agidict/html/welcome.html
grid computing: An ambitious and exciting global effort to develop an environment in which individual users can access computers, databases and experimental facilities simply and transparently, without having to consider where those facilities are located. [RealityGrid, Engineering & Physical Sciences Research Council, UK 2001] http://www.realitygrid.org/information.html
Global Grid Forum http://www.gridforum.org/
Globus Project http://www.globus.org/
Narrower term: desktop grids; Related terms: utility grids, Information management & interpretation glossary semantic grid
high performance computing: Webopedia definition http://www.webopedia.com/TERM/H/High_Performance_Computing.html
Related terms: Distributed Computing Environment DCE, MPP Massively Parallel Processing, petaflop, supercomputers, teraflop
Human Computer Interface HCI: http://usableweb.com/authors/perlmangary.html
information extraction: Automated ways of extracting unstructured or partially structured information from machine-readable files. Compare with information retrieval.
Related term: Information management & interpretation glossary natural language processing
information retrieval:
information technology: Information technology plays a key role in helping organizations achieve profitable results and keep competitive forces in check. With the completion of the draft sequence of the human genome and the push for protein data analysis, the life sciences industry is faced with the daunting task of creating computing infrastructures that support a high level of data interpretation. Never before has the need for significant computing power been so great. Cambridge Healthtech Institute, IT and Informatics conference series
information visualization, interoperability, knowledge management: Information management & interpretation glossary
LIMS Laboratory Information Management Systems: Drug discovery & development glossary
LSR [Life Sciences Research] group: Focused on the use of CORBA for objects at all levels of software systems for life sciences research. CORBA is implementation-language and platform-independent, so specifications adopted by the LSR group can be implemented in the most appropriate language(s) on a variety of hardware and operating systems. Part of OMG. http://www.omg.org/homepages/lsr/FAQ.html#LSR vs BW
legacy systems: Hardware and software applications in which a company has already invested considerable time and money. Legacy systems typically perform critical operations in companies for many years even though they may no longer use state-of-the-art technology. Replacing legacy systems can be disruptive and therefore requires careful planning and appropriate migration support from the manufacturer. [ISPE International Society for Pharmaceutical Engineering Glossary: Resources and glossaries, Michelle Gonzalez] http://www.ispe.org/
Linux: A free Unix-type operating system originally created by Linus Torvalds with the assistance of developers around the world. Developed under the GNU General Public License, the source code for Linux is freely available to everyone. [Linux Home Page] http://www.linux.org/
Linux clusters: Networks of multiple processors joined to form a unified and more powerful computing system; they are becoming a major technology in industry. A node within a Linux cluster is the basic unit of processing.
MPP Massively Parallel Processing:
machine-readable: See under metadata
machine-understandable: See under metadata
markup languages: XML eXtensible Markup Language;
markup languages, standards core: Robin Cover, Core Standards for Markup Languages, 2002 http://xml.coverpages.org/coreStandards.html
massively parallel computing: Sequencing and assembling the genome required teraflop clusters; proteomics could require 10 to 100 times as much computing power. Optimal target identification with design of intervention may require petascale computing. Further, simulation and related optimization efforts place much more emphasis on scalable, massively parallel computing than do informatics tasks. Informatics and simulation pose different architectural requirements on computing; informatics involves almost no floating point arithmetic, needs fairly minimal communications capabilities, and tends to be input-output bound. Simulation and optimization are highly dependent on floating point operations, involve less input-output, and require an effective communications fabric among processors.
mathML: Algorithms glossary
memory-mapped data structures: In this approach [to data- level integration without semantic cleaning] subsets of data from various sources are collected, normalized, and integrated in memory for quick access. While this approach performs actual data integration and addresses the problem of poor performance in the federated approach, it requires additional calls to traditional relational databases to integrate descriptive data. While data cleaning is being performed on some of the data sources, it is not being done across all sources or in the same place. This makes it difficult to quickly add new data sources.
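A toy sketch of the approach described above (the sources, fields and normalisation step are all hypothetical): records from two sources are normalised to a shared key and merged into one in-memory index for quick access:

```python
# Two hypothetical data sources with inconsistent key conventions.
source_a = [{"gene": "tp53", "organism": "human"}]
source_b = [{"symbol": "TP53", "pathway": "apoptosis"}]

# Normalise each record's key (upper-case here stands in for real data
# cleaning) and integrate everything into a single in-memory index.
index = {}
for rec in source_a:
    index.setdefault(rec["gene"].upper(), {}).update(organism=rec["organism"])
for rec in source_b:
    index.setdefault(rec["symbol"].upper(), {}).update(pathway=rec["pathway"])

print(index["TP53"])  # {'organism': 'human', 'pathway': 'apoptosis'}
```

The limitation the entry notes is visible even here: each new source needs its own normalisation loop, which is why adding sources quickly is difficult.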
mesh interface: There is a wide array of mesh and discretization strategies available to solve application problems and many times it is not clear a priori which is the best strategy for a particular simulation. See Fig 1. The only way to determine the proper choice is to experiment with a number of options. This is both time consuming and difficult because most mesh and discretization tools have very different programming interfaces. To enable this kind of experimentation, and as a first step toward interoperability, the TSTT team is developing a common software interface for its many mesh management infrastructures. A key aspect of our approach is that we do not enforce any particular data structure or implementation with our interfaces, only that certain questions about the mesh can be answered through calls to the interface. The challenges inherent in this type of effort include balancing performance of the interface with the flexibility needed to support a wide variety of mesh types. Further challenges arise when considering the support of many different scientific programming languages. Terascale Simulations Tools and Technologies Center [TSTT], Scientific Discovery Through Advanced Computing, 2003 http://www.osti.gov/scidac/updatesglimm1.html
metacomputer: A collection of computers held together by state-of-the-art technology and "balanced" so that, to the individual user, it looks and acts like a single computer. The constituent parts of the resulting "metacomputer" could be housed locally, or distributed between buildings, even continents. [MetaComputer HomePage, National Center for Supercomputing Applications, Univ. of Illinois Urbana-Champaign, US] http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaHome.html
MetaComputer Glossary of Terms http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html
metadata: Information management & interpretation glossary
middleware: A technology for defining objects and creating interfaces between software systems.
An example of middleware is CORBA. Integrated applications can be built on top of middleware to produce a federated database approach.
Related term: DCE Distributed Computing Environment
modularity: Ensures that, for the particular task at hand, the data will be collected and stored in an appropriate manner - which differs greatly from one level of activity (simply gathering the raw data) to another (storing analyzed data) and from one type of high-throughput system to another. ... The best system is one that employs integration at those levels where it is an advantage but maintains enough modularity to ensure that (1) there are no major compromises regarding how any one type of data is handled and (2) all the key elements in a researcher's information system can be adjusted or updated independently.
Modularity Home Page, Raffaele Calabretta, Institute of Psychology, National Research Council, Rome, Italy http://gral.ip.rm.cnr.it/rcalabretta/modularity.html
Related terms: integration, interoperability
molecular computers: Computers whose input, output and state transitions are carried out by biochemical interactions and reactions. [MeSH 2003]
molecular computing: Ruzena Bajcsy, Assistant Director for Computer and Information Science and Engineering at the National Science Foundation, was lead-off witness at a September 12 House Science Committee Hearing on "Beyond Silicon Computing: Quantum and Molecular Computing" ... is currently supporting a number of researchers who are exploring how physical processes can be exploited as computing substrates - chemical, biomolecular, optical computing via photonics, and quantum systems... Chairman Nick Smith pressed the panel for their visions of where this research would take us in 20 or 30 years. Witnesses suggested applications for non-silicon based computing, including cryptography, pharmaceutical development, protein folding, and data storage and mining. Dr. Bajcsy suggested that very small computers would provide portable devices that would enhance and extend our sensory capabilities - the vision of an eagle, the olfaction of a dog, or the hearing of a rabbit. [National Science Foundation, Hearing Summary: House Science Committee's Hearing on "Beyond Silicon Computing: Quantum and Molecular Computing" Sept. 12, 2000] http://www.nsf.gov/od/lpa/congress/106/hs_beyondsilicon.htm
Related terms: DNA computing, quantum computing. Or are any of these the same?
Moore's Law: Gordon Moore made his famous observation in 1965, just four years after the first planar integrated circuit was discovered. The press called it "Moore's Law" and the name has stuck. In his original paper, Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue. Through Intel's relentless technology advances, Moore's Law, the doubling of transistors every couple of years, has been maintained, and still holds true today. Intel expects that it will continue at least through the end of this decade. ["Moore's Law", Intel Research, 2002] http://www.intel.com/research/silicon/mooreslaw.htm
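The "doubling every couple of years" claim is simple arithmetic; the starting count below (the 2,300-transistor Intel 4004 of 1971) is chosen only for illustration:

```python
def transistors(n0: int, years: float, doubling_period: float = 2.0) -> float:
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# From 2,300 transistors, a two-year doubling over 30 years predicts
# roughly 75 million - the right order of magnitude for chips circa 2001.
print(round(transistors(2300, 30)))  # 75366400
```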
munging: A common term in the programmer’s world. Many computing tasks require taking data from one computer system, manipulating it in some way, and passing it to another. Munging can mean manipulating raw data to achieve a final form. It can mean parsing or filtering data, or the many steps required for data recognition. Or it can be something as simple as converting hours worked plus pay rates into a salary cheque. This book shows you how to process data productively with Perl. It discusses general munging techniques and how to think about data munging problems. David Cross, Data Munging with Perl, Manning Publications Co., 2001 http://www.manning.com/cross/
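The salary-cheque example above is a one-line munge; the toy pipeline below (input format invented for illustration) shows the parse/filter/manipulate steps in Python rather than Perl:

```python
# Hypothetical raw payroll lines: "name,hours,rate".
raw = ["alice,38,21.50", "bob,40,19.00", "corrupt line"]

def munge(lines):
    for line in lines:
        parts = line.split(",")
        if len(parts) != 3:   # filtering step: drop unparseable records
            continue
        name, hours, rate = parts
        yield name, float(hours) * float(rate)  # manipulation step: pay owed

print(dict(munge(raw)))  # {'alice': 817.0, 'bob': 760.0}
```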
OASIS Organization for the Advancement of Structured Information Systems: A not- for- profit, global consortium that drives the development, convergence and adoption of e-business standards. http://www.oasis-open.org/who/
OASIS Glossary of terms, 50+ terms http://www.oasis-open.org/glossary/index.php
OMG Object Management Group: Distributed object computing industry standards group founded in 1989. The OMG is moving forward in establishing CORBA as the "Middleware that's Everywhere" through its worldwide standard specifications: CORBA/IIOP, Object Services, Internet Facilities and Domain Interface specifications, UML and other specifications supporting Analysis and Design. http://www.omg.org/
object based ontologies: Extensively used, good structuring, intuitive. Semantics defined by the OKBC standard. Examples: EcoCyc (uses Ocelot) and RiboWeb (uses Ontolingua).
Object-Oriented Modeling OOM: A method for designing software and databases that combines programs and data into self-contained packages called classes, and organizes these classes into a type/subtype hierarchy. It is an excellent way to design software and databases that have to cope with a lot of picayune detail and many "exceptions to the rule," so long as the basic structure of the problem can be well-represented by a type/subtype hierarchy. Such designs can be easily distributed, language-independent, and hardware-neutral.
Object-Protocol Model (OPM): Developed initially by members of the Data Management Research and Development Group at Lawrence Berkeley National Laboratory ... aims to support rapid development of complete database systems, construction of powerful system-independent query interfaces on top of relational and flat-file data resources, integration of heterogeneous data resources and applications into a common object-oriented framework, deployment of configurable Web-based query interfaces for single or multiple databases.
open source: While there is agreement on the broad term "open source" as meaning approximately what is captured in the Open Source Definition the term has, ironically, now become so popular that it has lost some of its precision. We strongly encourage everyone who cares about open software to use the term only to describe licenses that conform to the OSD, or software distributed under such licenses; but since the term has passed into more general use, we also encourage people to refer to the "OSI Certified" mark, which has precision and legal force in identifying software distributed under licenses that are known to meet the OSD requirements. [Open Source Initiative FAQ, 2002 ] http://www.opensource.org/advocacy/faq.php
open source software: The Open Source Initiative is a marketing program for free software. It's a pitch for "free software" on solid pragmatic grounds rather than ideological tub-thumping. ... So that it is clear what kind of software we are talking about, we publish standards for open-source licenses. We have created a certification mark, "OSI Certified," to be applied only to software that is distributed under an open-source license that meets criteria set by the Open Source Initiative as representatives of the open software community. We intend this mark to become a widely recognized and valued symbol, clearly indicating that software does, in fact, have the properties that the community has associated with the descriptive term `open source'. [Open Source Initiative FAQ, 2002] http://www.opensource.org/advocacy/faq.php
The Cathedral and the bazaar, Eric Steven Raymond http://www.tuxedo.org/~esr/writings/cathedral-bazaar/
pervasive computing: An emerging trend in which computing devices are increasingly ubiquitous, numerous and mobile. [NIST "Pervasive Computing 2001" May 1-2, 2001, Gaithersburg MD] http://www.nist.gov/pc2001/
peta: 10^15 (quadrillions). SI unit prefixes beyond peta are exa 10^18 (quintillions), zetta 10^21 (sextillions) and yotta 10^24 (septillions). Compare with prefixes for the smallest numbers: Ultrasensitivity glossary atto, femto, micro, nano, pico, yocto, zepto
petaflop: A petaflops computer is more powerful than all of the computers on today's Internet combined. If such a system incorporated a petabyte of memory, it could hold all 17 million books in the Library of Congress or several thousand years' worth of videotapes. To fabricate such a system today from the best price/performance systems available requires up to 10 million processors and consumes more than one billion watts of power. Its cost would be approximately $25 billion, and the supercomputer would fail every couple of minutes. The system would cover the flight decks of all existing Nimitz-class aircraft carriers or fill up most of the Empire State Building with its hardware. [T. Sterling "In pursuit of a quadrillion operations per second" Insights, NASA, Apr. 1998] http://www.hq.nasa.gov/hpcc/insights/vol5/petaflop.htm
NSF Pursues Petaflop Computers, Oct. 25, 1996 http://www.nsf.gov/od/lpa/news/press/pr9665.htm
Related term: teraflop computing. Broader term: FLOP
quantum computation: A fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference). [FAQ, Centre for Quantum Computation, Univ. Oxford, UK, 2001] http://www.qubit.org/oldsite/QuantumComputationFAQ.html
quantum computing: The idea of a computational device based on quantum mechanics was first explored in the 1970's and early 1980's by physicists and computer scientists such as Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of the University of Oxford, and the late Richard P. Feynman of the California Institute of Technology (Caltech). The idea emerged when scientists were pondering the fundamental limits of computation. They understood that if technology continued to abide by Moore's Law, then the continually shrinking size of circuitry packed onto silicon chips would eventually reach a point where individual elements would be no larger than a few atoms. Here a problem arose because at the atomic scale the physical laws that govern the behavior and properties of the circuit are inherently quantum mechanical in nature, not classical. This then raised the question of whether a new kind of computer could be devised based on the principles of quantum physics. Feynman was among the first to attempt to provide an answer to this question by producing an abstract model in 1982 that showed how a quantum system could be used to do computations. He also explained how such a machine would be able to act as a simulator for quantum physics. In other words, a physicist would have the ability to carry out experiments in quantum physics inside a quantum mechanical computer. Later, in 1985, Deutsch realized that Feynman's assertion could eventually lead to a general purpose quantum computer and published a crucial theoretical paper showing that any physical process, in principle, could be modeled perfectly by a quantum computer. Thus, a quantum computer would have capabilities far beyond those of any traditional classical computer. After Deutsch published this paper, the search began to find interesting applications for such a machine. 
[Jacob West, The Quantum Computer, An introduction, 2000] http://www.cs.caltech.edu/~westside/quantum-intro.html
Related terms: DNA computing, molecular computing, nanocomputer. Or are any of these the same?
qubit: A key concept in the very new field of quantum computing. The aim is to produce a device which is the quantum equivalent of the digital computer. The qubit (pronounced exactly the same way as the Old Testament measurement) is a 'quantum bit', the analogue at quantum dimensions of the ordinary computer's 1 or 0, on or off, heads or tails binary digit or bit. Unlike such digital representations, a qubit remains in an indeterminate state until it is observed, like a tossed coin that is still spinning. It was shown recently that in theory a quantum computer could solve certain mathematical problems, such as factoring large numbers, much faster than conventional ones, and so could be used, for example, in codebreaking. It might even be possible to employ the 'action at a distance' properties of quantum mechanics to transport information instantaneously over great distances without loss. This may all sound like S[cience] F[iction], but the first two-bit quantum logic gates were actually demonstrated at the end of 1995. [World Wide Words, 1996] http://www.quinion.com/words/turnsofphrase/tp-qub1.htm
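The spinning-coin analogy can be made concrete with a toy model: a qubit held as two amplitudes whose squared magnitudes give the measurement probabilities. This is a sketch only; it ignores the phase effects that make real quantum computation powerful:

```python
import random
from math import sqrt

def measure(alpha: complex, beta: complex, rng=random.random) -> int:
    """Collapse a qubit state (alpha|0> + beta|1>) to a classical 0 or 1."""
    p0 = abs(alpha) ** 2        # probability of observing 0
    return 0 if rng() < p0 else 1

# An equal superposition - the coin still spinning in the entry's analogy.
alpha = beta = 1 / sqrt(2)
counts = [0, 0]
for _ in range(10000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly [5000, 5000]
```

Before measurement the state is genuinely indeterminate; a basis state like (1, 0), by contrast, always yields the same outcome.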
RSS RDF Site Summary: A lightweight multipurpose extensible metadata description and syndication format. RDF Site Summary RSS, web resource.org http://web.resource.org/rss/1.0/spec
RSS feeds: O'Reilly XML.com http://www.xml.com/pub/a/2003/04/30/qa.html
SMP Symmetric MultiProcessing: Multiple processors (two or more) share the same memory and operating system. SMP systems are scalable, as more processors can be added as needed. Related term: MPP Massively Parallel Processing
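The shared-memory idea can be sketched in a few lines; here Python threads stand in for processors sharing one address space (a simplification: real SMP hardware runs the streams on separate CPUs):

```python
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

# One counter in a single shared address space, updated by several
# execution streams - the essence of SMP-style programming.
counter = 0
lock = Lock()

def work(n):
    global counter
    for _ in range(n):
        with lock:            # serialise access to the shared memory word
            counter += 1

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(4):
        pool.submit(work, 1000)   # pool exit waits for all workers

print(counter)  # 4000
```

Without the lock the increments could interleave and lose updates, which is why shared-memory systems need synchronisation primitives.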
scientific computing: Includes database design, graphical interfaces, querying approaches, data retrieval, data visualization and manipulation, data integration through the development of integrated analytical tools, synthesis, and tools for electronic collaboration, as well as computational research including the development of structural, functional, integrative, and analytical models and simulations.
server farm: geek.com definition http://www.geek.com/glossary/glossary_search.cgi?s
Netlingo definition http://www.netlingo.com/lookup.cfm?term=server%20farm
TechTarget definition: http://whatis.techtarget.com/definition/0,,sid9_gci213707,00.html
Webopedia definition http://www.webopedia.com/TERM/S/server_farm.html
Related term: compute farm
server types: Webopedia http://www.webopedia.com/quick_ref/servers.asp
software interoperability: Information management & interpretation glossary
supercomputers: Very fast computers. Often used for graphics, modeling or simulations.
Webopedia definition http://www.webopedia.com/TERM/S/supercomputer.html
whatis.com definition http://www.swif.uniba.it/lei/foldop/foldoc.cgi?supercomputer
Future of Supercomputing, National Academies of Science, US, in progress 2003 http://www7.nationalacademies.org/cstb/project_supercomputing.html
Related terms: high performance computing, petaflop, teraflop; Protein structure glossary Blue gene
teraFlop (Tflop): 10^12 floating point operations per second (trillions).
The development of massively parallel computers with teraflop speed and the mastering of the associated programming problems will clearly shape new computational solutions for science in coming years. Techniques for experimental determination increasingly rely on advanced computational tools.
Related term: petaflop computing. Broader term: FLOP
teragrid: A multi-year effort to build and deploy the world's largest, fastest, most comprehensive, distributed infrastructure for open scientific research. When completed, the TeraGrid will include 13.6 teraflops of Linux Cluster computing power distributed at the four TeraGrid sites, facilities capable of managing and storing more than 450 terabytes of data, high-resolution visualization environments, and toolkits for grid computing. These components will be tightly integrated and connected through a network that will initially operate at 40 gigabits per second and later be upgraded to 50-80 gigabits/second — 16 times faster than today's fastest research network. [TeraGrid.org Home Page] http://www.teragrid.org/
ubiquitous computing: Ubiquitous computing and wearable computing have been posed as polar opposites even though they are often applied in very similar applications. Here we first outline the advantages and disadvantages of each and propose that the two perspectives have complementary problems. We then attempt to demonstrate that the failings of both ubiquitous and wearable computing can be alleviated by the development of systems that properly mix the two. ... When Mark Weiser coined the phrase "ubiquitous computing" in 1988 he envisioned computers embedded in walls, in tabletops, and in everyday objects. In ubiquitous computing, a person might interact with hundreds of computers at a time, each invisibly embedded in the environment and wirelessly communicating with each other [Weiser, 1993]. Closely related to the ubiquitous computing vision is the more centralized idea of smart rooms, where a room might contain multiple sensors that keep track of the comings and goings of the people around [Pentland, 1996]. [Bradley J. Rhodes et al., Wearable computing meets ubiquitous computing: Reaping the best of both worlds, The Proceedings of The Third International Symposium on Wearable Computers (ISWC '99), San Francisco, CA, October 18-19, 1999, pp. 141-149] http://web.media.mit.edu/~rhodes/Papers/wearhive.html
Has roots in many aspects of computing. In its current form, it was first articulated by Mark Weiser in 1988 at the Computer Science Lab at Xerox PARC. [Ubiquitous computing, Xerox PARC Sandbox Server] http://www.ubiq.com/hypertext/weiser/UbiHome.html
Ubiquitous computing, MIT Media Lab special issue, IBM Systems Journal 39 (3&4) 2000 http://www.research.ibm.com/journal/sj39-34.html
Related term: pervasive computing
utility computing: Computing power on demand (similar to electricity). Sun, HP [Hewlett Packard] and IBM have utility computing initiatives.
virtualization: TechTarget definition http://searchstorage.techtarget.com/sDefinition/0,,sid5_gci499539,00.html
wrappers: integrated databases
XACML Extensible Access Control Markup Language: The purpose of this TC [Technical Committee] is to define a core schema and corresponding namespace for the expression of authorization policies in XML against objects that are themselves identified in XML. http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml
XML eXtensible Markup Language: The universal format for structured documents and data on the Web. [W3C, "Extensible Markup Language (XML)" 2002] http://www.w3.org/XML/
Describes a class of data objects called XML documents and partially describes the behavior of computer programs which process them. XML is an application profile or restricted form of SGML, the Standard Generalized Markup Language. By construction, XML documents are conforming SGML documents.
"XML is primarily intended to meet the requirements of large-scale Web content providers for industry-specific markup, vendor-neutral data exchange, media-independent publishing, one-on-one marketing, workflow management in collaborative authoring environments, and the processing of Web documents by intelligent clients. It is also expected to find use in certain metadata applications." [Robin Cover, XML Cover Pages, 2002]
Well-formed XML Documents, Bonnie SooHoo, WebReview, Aug. 4, 2000. http://www.webreview.com/2000/08_04/webauthors/08_04_00_4.shtml
XML in 10 points http://www.w3.org/XML/1999/XML-in-10-points
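Well-formedness, the property discussed in the SooHoo article above, can be illustrated with a short sketch using Python's standard-library XML parser; the element and attribute names below are invented for the example:

```python
# Sketch of XML well-formedness using Python's xml.etree.ElementTree.
import xml.etree.ElementTree as ET

# Properly nested tags and quoted attributes: a well-formed document.
well_formed = '<book lang="en"><title>Example</title></book>'
root = ET.fromstring(well_formed)
print(root.tag, root.attrib["lang"], root.find("title").text)

# Mismatched tag nesting violates well-formedness, so the parser rejects it.
try:
    ET.fromstring("<book><title>Example</book></title>")
except ET.ParseError as err:
    print("not well-formed:", err)
```

By construction a well-formed XML document has exactly one root element and strictly nested tags, which is what lets a non-validating parser like this one process it without any schema or DTD.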
XML schema: Express shared vocabularies and allow machines to carry out rules made by people. They provide a means for defining the structure, content and semantics of XML documents. [W3C consortium "XML schema"] http://www.w3.org/XML/Schema
ACM Computing Classification System, Association of Computing Machinery, 1998 http://www.acm.org/class/1998/ Currently valid; no definitions.
Cnet glossary, http://www.cnet.com/Resources/Info/Glossary/index.html
[FOLDOC] Free On-line Dictionary of Computing, Denis Howe, 2001. 13,000+ terms. http://foldoc.doc.ic.ac.uk/foldoc/index.html
Free software glossary, Andrew Leonard, The Free Software Project, Salon.com, 2000, 66 definitions http://cobrand.salon.com/tech/fsp/glossary/
DM Review Glossary, http://www.dmreview.com/master.cfm?NavID=32&KeywordID=a
Geek.com Technical Glossary, 1996-2002, 2000+ definitions. http://www.geek.com/glossary/glossary_search.htm
Howe, Walt, Glossary of Internet Terms, 2002, 360+ terms http://www.walthowe.com/glossary/
Jargon File 4.3.1, June 2001 http://www.tuxedo.org/~esr/jargon/jargon.html
NetGlos, English http://wwli.com/translation/netglos/glossary/glossary.html Multilingual, 200 definitions. http://wwli.com/translation/netglos/netglos.html
Lycos Tech Glossary 2002 http://webopedia.lycos.com/
Microsoft Lexicon or Microspeak made easier, Ken Barnes et al., 1995-1998, 150+ terms. http://www.cinepad.com/mslex.htm
National Center for Supercomputing Applications, MetaComputer Glossary, Univ. of Illinois, Urbana-Champaign, 1995. 45 definitions. http://archive.ncsa.uiuc.edu/Cyberia/MetaComp/MetaGlossary.html
Tech WordSpy, Logophilia Limited, 1991-2002, 500+ terms. http://www.logophilia.com/TechWordSpy/index.asp
W3C Glossaries, http://www.w3.org/Glossary Links to 6 web, internet, W3C, hypermedia glossaries, acronyms and 5 specialized glossaries [document object model, math, quality assurance, web accessibility initiative and web services].
whatis.com Information Technology encyclopedia. About 3,000+ definitions. http://whatis.techtarget.com/
Alpha glossary index