A random variable is a numerical value determined by the outcomes or events of an experiment. A random independent variable is one whose levels vary from one replication to another and are not determined by the experimenter.
A variable whose values are random but whose statistical distribution is known. In statistics, a function that can take on either a finite number of values, each with an associated probability, or an infinite number of values, whose probabilities are summarized by a density function. Used in studying chance events, it is defined so as to account for all possible outcomes of the event. When these are finite (e.g., the number of heads in a three-coin toss), the random variable is called discrete and the probabilities of the outcomes sum to 1.
A quantity which may take any of the values of a specified set with a specified relative frequency or probability. It is defined by a set of possible values and by an associated probability function giving the relative frequency of occurrence of each possible value.
A function which assigns a numerical value to all possible outcomes of an experiment. The values of random variables differ from one observation to the next in a manner described by their probability distribution.
A function from the set of all possible outcomes of an event to some subset of the real numbers; e.g., for the event of rolling a standard die, a random variable could assign the face shown to the set {1, 2, ..., 6}.
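The die-rolling definition above can be sketched directly in Python, treating the random variable as an ordinary function from outcomes to numbers. The outcome labels and the function name `X` are illustrative choices, not from any of the quoted sources:

```python
import random

# Sample space for a standard die: six distinguishable outcomes.
SAMPLE_SPACE = ["face_1", "face_2", "face_3", "face_4", "face_5", "face_6"]

def X(outcome):
    """Random variable: maps each outcome to a number in {1, 2, ..., 6}."""
    return int(outcome.split("_")[1])

# One replication of the experiment: observe an outcome, then apply X.
outcome = random.choice(SAMPLE_SPACE)
value = X(outcome)
assert value in {1, 2, 3, 4, 5, 6}
```

The point of the sketch is that the randomness lives in which outcome occurs; `X` itself is a deterministic mapping from outcomes to real numbers.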
(Statistics) A variable characterized by random behavior in assuming its different possible values. Mathematically, it is described by its probability distribution, which specifies the possible values of a random variable together with the probability associated with each value.
A variable characterized by random behavior in assuming its different possible values. Mathematically, it is described by its probability distribution, which specifies the possible values of a random variable together with the probability associated (in an appropriate sense) with each value. A random variable is said to be continuous if its possible values extend over a continuum and discrete if its possible values are separated by finite intervals. Also called variate. See probability theory.
Probabilities for specific outcomes are determined by summing probabilities (in the discrete case) or by integrating the density function over an interval corresponding to that outcome (in the continuous case)
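Both computations can be sketched in a few lines of Python. The distributions below (a fair die for the discrete case, an exponential density for the continuous case) are illustrative assumptions, not taken from the definitions themselves:

```python
import math

# Discrete case: P(X is even) for a fair die is a sum of point probabilities.
pmf = {k: 1 / 6 for k in range(1, 7)}
p_even = sum(p for k, p in pmf.items() if k % 2 == 0)  # sums to 1/2

# Continuous case: for the density f(t) = lam * exp(-lam * t), t >= 0,
# P(a <= T <= b) is the integral of f over [a, b]. Here we use the
# closed-form antiderivative (the CDF) rather than numerical integration.
lam = 0.5
def exp_cdf(t):
    return 1.0 - math.exp(-lam * t)

p_interval = exp_cdf(3.0) - exp_cdf(1.0)
```

For densities without a closed-form integral, the same probability would be obtained by numerical quadrature over the interval.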
"A function that assigns a numerical value to each outcome of an experiment" (Dolciani, 1988) "The outcomes form the sample space of the Random Variable" (Dolciani, Beckenbach, Donnelly, Jurgensen, & Wooton, 1980)
If the possible outcomes are infinite (e.g., the life expectancy of a light bulb), the random variable is called continuous and corresponds to a density function whose integral over the entire range of outcomes equals 1.
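This normalization can be checked numerically. The exponential density below is a commonly assumed (illustrative) model for lifetimes such as light bulbs; its integral over [0, T] approaches 1 as T grows:

```python
import math

lam = 0.001  # illustrative failure rate per hour

def density(t):
    """Exponential density f(t) = lam * exp(-lam * t) for t >= 0."""
    return lam * math.exp(-lam * t)

# Midpoint-rule approximation of the integral of the density over [0, T].
# For T much larger than 1/lam, the result should be very close to 1.
T, n = 20000.0, 20000
h = T / n
total = sum(density((i + 0.5) * h) for i in range(n)) * h
assert abs(total - 1.0) < 1e-3
```

Any valid density must pass this check; a function whose total integral differs from 1 cannot describe the probabilities of a continuous random variable.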
A variable that assumes numerical values that are determined by the outcome of an experiment; that is, a variable that represents an uncertain numerical outcome (page 153).