
MITAutodictation

Terms

Relational Calculus
Relational calculus consists of two calculi, the tuple relational calculus and the domain relational calculus, which are part of the relational model for databases and provide a declarative way to specify database queries.
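The declarative flavor can be sketched in Python with a set-builder-style comprehension; the Employee relation and wage threshold below are invented for illustration, mimicking a tuple-calculus query of the form { t | Employee(t) AND t.wage > 50000 }:

```python
# Hypothetical Employee relation, represented as a list of tuples (dicts).
employees = [
    {"name": "Ada", "wage": 60000},
    {"name": "Bob", "wage": 40000},
]

# Declarative, set-builder style: state *which* tuples qualify,
# not the step-by-step procedure for retrieving them.
high_paid = [t["name"] for t in employees if t["wage"] > 50000]
print(high_paid)  # ['Ada']
```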
Claudius Ptolemy
Claudius Ptolemy (no relationship to the Egyptian rulers), most active period 127-151 AD, introduced and formalized many of our map concepts. He was also a librarian at the Alexandria Museum, although by this time most of the power in the world resided in Rome. He introduced the division of degrees into minutes and seconds. (The notion of 360 degrees in a circle comes from Babylon, about 3000 BC: the Babylonians thought a year was 360 days long and thus made a degree about one day.)
Eratosthenes
Eratosthenes (276-196 BC) first determined the radius of the Earth. Ptolemy III appointed him librarian of the Alexandrian Museum in 240 BC. He noted that in Syene (now Aswan), up the Nile from Alexandria, the sun shone to the bottom of a deep well at high noon on June 21; no such phenomenon was observed in Alexandria. It was also known at this time that the sun appeared to move north and south during the year by about 24 degrees, and the extremes defined what are now known as the tropics (from tropos, "to turn"). On the day the sun was directly overhead at Syene, Eratosthenes measured the angle cast by shadows in Alexandria: about 7.2 degrees, or one-fiftieth of a circle, so multiplying the Alexandria-Syene distance by fifty gave the Earth's circumference.
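The arithmetic behind the method can be sketched with the commonly cited historical figures (assumed here; the modern length of the stadion is disputed):

```python
# Eratosthenes' computation, using the commonly cited figures.
shadow_angle_deg = 7.2      # shadow angle measured at Alexandria at noon
distance_stadia = 5000      # reported Alexandria-Syene distance

# The shadow angle equals the arc of the Earth between the two cities,
# so the full circumference is (360 / angle) times that distance.
circumference_stadia = (360 / shadow_angle_deg) * distance_stadia
print(round(circumference_stadia))  # 250000
```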
Hipparchus
Hipparchus (190-120 BC) introduced the idea of latitude and longitude lines, but Ptolemy used them systematically in his Geography and improved on Hipparchus's mathematics. Hipparchus also measured the Earth-Moon distance using eclipse data and estimated the precession of the Earth's rotation axis in space.
Data Independence
Data independence is the type of data transparency that matters for a centralized DBMS. It refers to the immunity of user applications to changes made in the definition and organization of the data, and vice versa.
Data Dependency
A data dependency in computer science is a situation in which a program statement (instruction) refers to the data of a preceding statement (instruction). The area of discovering data dependencies among statements (or instructions) is called dependence analysis in compilers.
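A minimal sketch of the most common case, a read-after-write ("true") dependency, with invented values:

```python
# S2 reads the value S1 wrote, so a compiler or CPU cannot
# safely reorder the two statements.
a = 2 + 3      # S1: writes a
b = a * 10     # S2: reads a, so S2 is data-dependent on S1
print(b)       # 50
```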
Hierarchical Model
In a hierarchical data model, data are organized into a tree-like structure. The structure allows repeating information using parent/child relationships: each parent can have many children but each child only has one parent. All attributes of a specific record are listed under an entity type. In a database, an entity type is the equivalent of a table; each individual record is represented as a row and an attribute as a column. Entity types are related to each other using 1:N mapping, also known as one-to-many relationships.

An example of a hierarchical data model would be if an organization had records of employees in a table (entity type) called "Employees". In the table there would be attributes/columns such as First Name, Last Name, Job Name and Wage. The company also has data about the employee's children in a separate table called "Children" with attributes such as First Name, Last Name, and DOB. The Employee table represents a parent segment and the Children table represents a Child segment. These two segments form a hierarchy where an employee may have many children, but each child may only have one parent.
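The parent/child segments can be sketched in Python with invented employee and child records; each child row carries exactly one parent key, giving the 1:N shape:

```python
# Parent segment: Employees, keyed by an assumed employee id.
employees = {1: {"first": "Jane", "last": "Doe", "job": "Engineer", "wage": 90000}}

# Child segment: Children, each record pointing to exactly one parent.
children = [
    {"parent_id": 1, "first": "Sam", "last": "Doe", "dob": "2015-04-01"},
    {"parent_id": 1, "first": "Kim", "last": "Doe", "dob": "2018-09-12"},
]

# 1:N traversal: one employee, many children.
kids = [c["first"] for c in children if c["parent_id"] == 1]
print(kids)  # ['Sam', 'Kim']
```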
Relational Model
The relational model for database management is a database model based on predicate logic and set theory. It was first formulated and proposed in 1969 by Edgar Codd with aims that included avoiding, without loss of completeness, the need to write computer programs to express database queries and enforce database integrity constraints. "Relation" is a mathematical term for "table", and thus "relational" roughly means "based on tables". It does not refer to the links or "keys" between tables, contrary to popular belief.
Normal forms/Database Normalization
Database normalization is a technique for designing relational database tables to minimize duplication of information and, in so doing, to safeguard the database against certain types of logical or structural problems, namely data anomalies. For example, when multiple instances of a given piece of information occur in a table, the possibility exists that these instances will not be kept consistent when the data within the table is updated, leading to a loss of data integrity. A table that is sufficiently normalized is less vulnerable to problems of this kind, because its structure reflects the basic assumptions for when multiple instances of the same information should be represented by a single instance only.
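The update-anomaly risk can be sketched in Python with an invented employee/department table: in the duplicated form a department rename must touch every row, while the normalized form stores the name once:

```python
# Unnormalized rows: the department name is repeated on every row,
# so a rename risks leaving inconsistent copies behind.
rows = [
    {"emp": "Ada", "dept_id": 1, "dept_name": "Research"},
    {"emp": "Bob", "dept_id": 1, "dept_name": "Research"},
]

# Normalized: each fact is stored once and linked by a key.
employees = [{"emp": r["emp"], "dept_id": r["dept_id"]} for r in rows]
departments = {r["dept_id"]: r["dept_name"] for r in rows}

# A rename now updates a single place, so no copies can drift apart.
departments[1] = "R&D"
print(departments)  # {1: 'R&D'}
```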
Early maps were produced for?
Navigation, taxation (area of land), and military defence information.
