Definition of error rate in the English-English dictionary.
(Computing) Error rate is a term describing the frequency of errors encountered during data transmission over a communications or network connection. The higher the error rate, the less reliable the connection or data transfer. The term can also apply to a physical hardware device or a software application, where in many cases a certain number of errors may be acceptable. This acceptable level is often referred to as the error rate and will vary according to the device being used. For instance, Optical Character Recognition (OCR) software at present has an error rate of around five to ten percent.
The ratio of incorrectly received data (bits, characters, or blocks) to the total amount of transmitted data.
A measure of data integrity, given as the fraction of bits that are flawed. Often expressed as a negative power of 10, as in 10⁻⁶ (a rate of one error in every one million bits).
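As a worked illustration of this definition (not part of the dictionary entry, and using made-up sample counts), the short Python sketch below computes an error rate as the fraction of flawed bits and prints it in the power-of-ten form described above:

```python
def error_rate(error_count: int, total_count: int) -> float:
    """Fraction of transmitted units (bits, characters, or blocks)
    that were received incorrectly."""
    if total_count <= 0:
        raise ValueError("total_count must be positive")
    return error_count / total_count

# Hypothetical sample: 3 flawed bits out of 3,000,000 transmitted.
rate = error_rate(3, 3_000_000)
print(f"error rate = {rate:.1e}")                # 1.0e-06
print(f"about one error per {round(1 / rate):,} bits")  # 1,000,000
```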
In many cases, it may be acceptable if an input device generates a certain number of errors. This is often referred to as the error rate, and the acceptable level will vary according to the input device being used and the business application. Optical character recognition, for example, is generally considered a comparatively unreliable means of entering data: at present, a typical OCR software package will have an error rate of between five and ten per cent.
Either the number of defective transactions or the number of defective steps in a transaction.
(1) The ratio of the number of signal elements (or data) incorrectly received to the total number transmitted. (2) The probability of an error occurring during the transmission of a message.
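To connect the two senses above, the following sketch (an illustration under the standard assumption that bit errors occur independently, not taken from the entry itself) derives the probability of a message error from a per-bit error rate: for rate p and an n-bit message, the probability of at least one error is 1 − (1 − p)ⁿ.

```python
def message_error_probability(bit_error_rate: float, message_bits: int) -> float:
    """Probability that at least one bit of an n-bit message is corrupted,
    assuming each bit fails independently at the given bit error rate."""
    return 1.0 - (1.0 - bit_error_rate) ** message_bits

# Hypothetical example: a 12,000-bit message over a link with a 10^-6 bit error rate.
p = message_error_probability(1e-6, 12_000)
print(f"P(message error) = {p:.4f}")  # approximately 0.0119
```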