B
$ B1, B2, or B3 computer system
(O) /TCSEC/ See: Tutorial under "Trusted Computer System
Evaluation Criteria".
$ back door
1. (I) /COMPUSEC/ A computer system feature -- which may be (a) an
unintentional flaw, (b) a mechanism deliberately installed by the
system's creator, or (c) a mechanism surreptitiously installed by
an intruder -- that provides access to a system resource by other
than the usual procedure and usually is hidden or otherwise not
well-known. (See: maintenance hook. Compare: Trojan horse.)
Example: A way to access a computer other than through a normal
login. Such an access path is not necessarily designed with
malicious intent; operating systems sometimes are shipped by the
manufacturer with hidden accounts intended for use by field
service technicians or the vendor's maintenance programmers.
2. (I) /cryptography/ A feature of a cryptographic system that
makes it easily possible to break or circumvent the protection
that the system is designed to provide.
Example: A feature that makes it possible to decrypt cipher text
much more quickly than by brute-force cryptanalysis, without
having prior knowledge of the decryption key.
Shirey Informational [Page 31]
RFC 4949 Internet Security Glossary, Version 2 August 2007
$ back up
(I) /verb/ Create a reserve copy of data or, more generally,
provide alternate means to perform system functions despite loss
of system resources. (See: contingency plan. Compare: archive.)
$ backup
(I) /noun or adjective/ Refers to alternate means of performing
system functions despite loss of system resources. (See:
contingency plan.)
Example: A reserve copy of data, preferably one that is stored
separately from the original, for use if the original becomes lost
or damaged. (Compare: archive.)
$ bagbiter
(D) /slang/ "An entity, such as a program or a computer, that
fails to work or that works in a remarkably clumsy manner. A
person who has caused some trouble, inadvertently or otherwise,
typically by failing to program the computer properly." [NCSSG]
(See: flaw.)
Deprecated Term: It is likely that other cultures use different
metaphors for these concepts. Therefore, to avoid international
misunderstanding, IDOCs SHOULD NOT use this term. (See: Deprecated
Usage under "Green Book".)
$ baggage
(O) /SET/ An "opaque encrypted tuple, which is included in a SET
message but appended as external data to the PKCS encapsulated
data. This avoids superencryption of the previously encrypted
tuple, but guarantees linkage with the PKCS portion of the
message." [SET2]
Deprecated Usage: IDOCs SHOULD NOT use this term to describe a
data element, except in the form "SET(trademark) baggage" with the
meaning given above.
$ baked-in security
(D) The inclusion of security mechanisms in an information system
beginning at an early point in the system's lifecycle, i.e.,
during the design phase, or at least early in the implementation
phase. (Compare: add-on security.)
Deprecated Term: It is likely that other cultures use different
metaphors for this concept. Therefore, to avoid international
misunderstanding, IDOCs SHOULD NOT use this term (unless they also
provide a definition like this one). (See: Deprecated Usage under
"Green Book".)
$ bandwidth
(I) The total width of the frequency band that is available to or
used by a communication channel; usually expressed in Hertz (Hz).
(RFC 3753) (Compare: channel capacity.)
$ bank identification number (BIN)
1. (O) The digits of a credit card number that identify the issuing
bank. (See: primary account number.)
2. (O) /SET/ The first six digits of a primary account number.
$ Basic Encoding Rules (BER)
(I) A standard for representing ASN.1 data types as strings of
octets. [X690] (See: Distinguished Encoding Rules.)
Deprecated Usage: Sometimes incorrectly treated as part of ASN.1.
However, ASN.1 properly refers only to a syntax description
language, and not to the encoding rules for the language.
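For illustration, BER represents each value as a tag-length-value (TLV)
triple. The following toy encoder is a sketch, not a conforming BER
implementation: it handles only the primitive, definite short-form
encoding of a small non-negative ASN.1 INTEGER, and the function name is
hypothetical.

```python
def ber_encode_integer(n: int) -> bytes:
    """Encode a small non-negative ASN.1 INTEGER in BER.

    Toy sketch: assumes the content fits in at most 127 octets, so the
    length is always expressible in the definite short form (one octet).
    """
    if n == 0:
        content = b"\x00"
    else:
        content = n.to_bytes((n.bit_length() + 7) // 8, "big")
        # ASN.1 INTEGER is signed: prepend a 0x00 octet if the high bit
        # is set, so the value is not misread as negative.
        if content[0] & 0x80:
            content = b"\x00" + content
    # Tag 0x02 = universal class, primitive, INTEGER; then one length
    # octet; then the content octets.
    return bytes([0x02, len(content)]) + content
```

For example, 127 encodes as the octets 02 01 7F, while 128 needs a
leading pad octet and encodes as 02 02 00 80.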
$ Basic Security Option
(I) See: secondary definition under "IPSO".
$ bastion host
(I) A strongly protected computer that is in a network protected
by a firewall (or is part of a firewall) and is the only host (or
one of only a few) in the network that can be directly accessed
from networks on the other side of the firewall. (See: firewall.)
Tutorial: Filtering routers in a firewall typically restrict traffic
from the outside network so that it can reach just one host, the
bastion host, which usually is part of the firewall. Since only
this one host can be directly attacked, only this one host needs
to be very strongly protected, so security can be maintained more
easily and less expensively. However, to allow legitimate internal
and external users to access application resources through the
firewall, higher-layer protocols and services need to be relayed
and forwarded by the bastion host. Some services (e.g., DNS and
SMTP) have forwarding built in; other services (e.g., TELNET and
FTP) require a proxy server on the bastion host.
$ BBN Technologies Corp. (BBN)
(O) The research-and-development company (originally called Bolt
Beranek and Newman, Inc.) that built the ARPANET.
$ BCA
(O) See: brand certification authority.
$ BCR
(O) See: BLACK/Crypto/RED.
$ BCI
(O) See: brand CRL identifier.
$ Bell-LaPadula model
(N) A formal, mathematical, state-transition model of
confidentiality policy for multilevel-secure computer systems
[Bell]. (Compare: Biba model, Brewer-Nash model.)
Tutorial: The model, devised by David Bell and Leonard LaPadula at
The MITRE Corporation in 1973, characterizes computer system
elements as subjects and objects. To determine whether or not a
subject is authorized for a particular access mode on an object,
the clearance of the subject is compared to the classification of
the object. The model defines the notion of a "secure state", in
which the only permitted access modes of subjects to objects are
in accordance with a specified security policy. It is proven that
each state transition preserves security by moving from secure
state to secure state, thereby proving that the system is secure.
In this model, a multilevel-secure system satisfies several rules,
including the "confinement property" (a.k.a. the "*-property"),
the "simple security property", and the "tranquility property".
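The two central rules can be sketched in code. The sketch below is a
deliberate simplification with hypothetical names: it uses a small,
totally ordered set of levels, whereas a real implementation compares
(level, category-set) pairs in a lattice.

```python
# Hypothetical, totally ordered clearance/classification levels.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2,
          "TOP SECRET": 3}

def blp_read_ok(subject_clearance: str, object_class: str) -> bool:
    # Simple security property ("no read up"): a subject may read an
    # object only if the subject's clearance dominates the object's
    # classification.
    return LEVELS[subject_clearance] >= LEVELS[object_class]

def blp_write_ok(subject_clearance: str, object_class: str) -> bool:
    # Confinement property, a.k.a. *-property ("no write down"): a
    # subject may write an object only if the object's classification
    # dominates the subject's clearance, so that high-level data cannot
    # leak into low-level objects.
    return LEVELS[object_class] >= LEVELS[subject_clearance]
```

For example, a SECRET subject may read a CONFIDENTIAL object but may
not write to it.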
$ benign
1. (N) /COMSEC/ "Condition of cryptographic data [such] that [the
data] cannot be compromised by human access [to the data]."
[C4009]
2. (O) /COMPUSEC/ See: secondary definition under "trust".
$ benign fill
(N) Process by which keying material is generated, distributed,
and placed into an ECU without exposure to any human or other
system entity, except the cryptographic module that consumes and
uses the material. (See: benign.)
$ BER
(I) See: Basic Encoding Rules.
$ beyond A1
1. (O) /formal/ A level of security assurance that is beyond the
highest level (level A1) of criteria specified by the TCSEC. (See:
Tutorial under "Trusted Computer System Evaluation Criteria".)
2. (O) /informal/ A level of trust so high that it is beyond
state-of-the-art technology; i.e., it cannot be provided or
verified by currently available assurance methods, and especially
not by currently available formal methods.
$ Biba integrity
(N) Synonym for "source integrity".
$ Biba model
(N) A formal, mathematical, state-transition model of integrity
policy for multilevel-secure computer systems [Biba]. (See: source
integrity. Compare: Bell-LaPadula model.)
Tutorial: This model for integrity control is analogous to the
Bell-LaPadula model for confidentiality control. Each subject and
object is assigned an integrity level and, to determine whether or
not a subject is authorized for a particular access mode on an
object, the integrity level of the subject is compared to that of
the object. The model prohibits the changing of information in an
object by a subject with a lesser or incomparable level. The rules
of the Biba model are duals of the corresponding rules in the
Bell-LaPadula model.
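The duality with the Bell-LaPadula rules can be sketched the same way.
As before, this is an illustrative simplification with hypothetical
integrity levels and function names, corresponding to the strict
variant of the model.

```python
# Hypothetical, totally ordered integrity levels.
INTEGRITY = {"LOW": 0, "MEDIUM": 1, "HIGH": 2}

def biba_read_ok(subject_level: str, object_level: str) -> bool:
    # "No read down": a subject may read only objects at its own
    # integrity level or higher, so it is not contaminated by
    # lower-integrity data.
    return INTEGRITY[object_level] >= INTEGRITY[subject_level]

def biba_write_ok(subject_level: str, object_level: str) -> bool:
    # "No write up": a subject may write only objects at its own
    # integrity level or lower, so it cannot corrupt higher-integrity
    # objects. This is the dual of Bell-LaPadula's *-property.
    return INTEGRITY[subject_level] >= INTEGRITY[object_level]
```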
$ billet
(N) "A personnel position or assignment that may be filled by one
person." [JCP1] (Compare: principal, role, user.)
Tutorial: In an organization, a "billet" is a personnel position, of
which there is exactly one instance; but a "role" is a functional
position, of which there can be multiple instances.
System entities are in one-to-one relationships with their
billets, but may be in many-to-one and one-to-many relationships
with their roles.
$ BIN
(O) See: bank identification number.
$ bind
(I) To inseparably associate by applying some security mechanism.
Example: A CA creates a public-key certificate by using a digital
signature to bind together (a) a subject name, (b) a public key,
and usually (c) some additional data items (e.g., "X.509 public-
key certificate").
$ biometric authentication
(I) A method of generating authentication information for a person
by digitizing measurements of a physical or behavioral
characteristic, such as a fingerprint, hand shape, retina pattern,
voiceprint, handwriting style, or face.
$ birthday attack
(I) A class of attacks against cryptographic functions, including
both encryption functions and hash functions. The attacks take
advantage of a statistical property: Given a cryptographic
function having an N-bit output, the probability is greater than
1/2 that for 2**(N/2) randomly chosen inputs, the function will
produce at least two outputs that are identical. (See: Tutorial
under "hash function".)
Derivation: From the somewhat surprising fact (often called the
"birthday paradox") that although there are 365 days in a year,
the probability is greater than 1/2 that two or more people share
the same birthday in any randomly chosen group of 23 people.
Birthday attacks enable an adversary to find two inputs for which
a cryptographic function produces the same cipher text (or find
two inputs for which a hash function produces the same hash
result) much faster than a brute-force attack can; and a clever
adversary can use such a capability to create considerable
mischief. However, no birthday attack can enable an adversary to
decrypt a given cipher text (or find a hash input that results in
a given hash result) any faster than a brute-force attack can.
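The 2**(N/2) figure can be checked empirically. The sketch below is
illustrative only: it uses SHA-256 truncated to N bits as a stand-in
for an N-bit hash function, and the function name is hypothetical.

```python
import hashlib
import os

def find_collision(n_bits: int):
    """Search random inputs until two distinct ones produce the same
    n_bits-truncated SHA-256 output. By the birthday paradox, the
    expected work is on the order of 2**(n_bits/2) trials."""
    seen = {}  # truncated hash value -> input that produced it
    while True:
        x = os.urandom(8)  # a random 8-byte input
        h = hashlib.sha256(x).digest()
        # Keep only the first n_bits of the 256-bit digest, so the
        # truncated value models an n_bits-bit hash function.
        t = int.from_bytes(h, "big") >> (256 - n_bits)
        if t in seen and seen[t] != x:
            return seen[t], x
        seen[t] = x
```

Calling find_collision(20) typically returns after roughly 2**10
(about a thousand) trials, even though the truncated hash has about a
million possible outputs.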
$ bit
(I) A contraction of the term "binary digit"; the smallest unit of
information storage, which has two possible states or values. The
values usually are represented by the symbols "0" (zero) and "1"
(one). (See: block, byte, nibble, word.)
$ bit string
(I) A sequence of bits, each of which is either "0" or "1".
$ BLACK
1. (N) Designation for data that consists only of cipher text, and
for information system equipment items or facilities that handle
only cipher text. Example: "BLACK key". (See: BCR, color change,
RED/BLACK separation. Compare: RED.)
2. (O) /U.S. Government/ "Designation applied to information
systems, and to associated areas, circuits, components, and
equipment, in which national security information is encrypted or
is not processed." [C4009]
3. (D) Any data that can be disclosed without harm.
Deprecated Definition: IDOCs SHOULD NOT use the term with
definition 3 because the definition is ambiguous with regard to
whether or not the data is protected.
$ BLACK/Crypto/RED (BCR)
(N) An experimental, end-to-end, network packet encryption system
developed in a working prototype form by BBN and the Collins Radio
division of Rockwell Corporation in the 1975-1980 time frame for
the U.S. DoD. BCR was the first network security system to support
TCP/IP traffic, and it incorporated the first DES chips that were
validated by the U.S. National Bureau of Standards (now called
NIST). BCR also was the first to use a KDC and an ACC to manage
connections.
$ BLACK key
(N) A key that is protected with a key-encrypting key and that
must be decrypted before use. (See: BLACK. Compare: RED key.)
$ BLACKER
(O) An end-to-end encryption system for computer data networks
that was developed by the U.S. DoD in the 1980s to provide host-
to-host data confidentiality service for datagrams at OSIRM Layer
3. [Weis] (Compare: CANEWARE, IPsec.)
Tutorial: Each user host connects to its own bump-in-the-wire
encryption device called a BLACKER Front End (BFE, TSEC/KI-111),
through which the host connects to the subnetwork. The system also
includes two types of centralized devices: one or more KDCs
connect to the subnetwork and communicate with assigned sets of
BFEs, and one or more ACCs connect to the subnetwork and
communicate with assigned KDCs. BLACKER uses only symmetric
encryption. A KDC distributes session keys to BFE pairs as
authorized by an ACC. Each ACC maintains a database for a set of
BFEs, and the database determines which pairs from that set (i.e.,
which pairs of user hosts behind the BFEs) are authorized to
communicate and at what security levels.
The BLACKER system is MLS in three ways: (a) The BFEs form a
security perimeter around a subnetwork, separating user hosts from
the subnetwork, so that the subnetwork can operate at a different
security level (possibly a lower, less expensive level) than the
hosts. (b) The BLACKER components are trusted to separate
datagrams of different security levels, so that each datagram of a
given security level can be received only by a host that is
authorized for that security level; and thus BLACKER can separate
host communities that operate at different security levels. (c)
The host side of a BFE is itself MLS and can recognize a security
label on each packet, so that an MLS user host can be authorized
to successively transmit datagrams that are labeled with different
security levels.
$ blind attack
(I) A type of network-based attack method that does not require
the attacking entity to receive data traffic from the attacked
entity; i.e., the attacker does not need to "see" data packets
sent by the victim. Example: SYN flood.
Tutorial: If an attack method is blind, the attacker's packets can
carry (a) a false IP source address (making it difficult for the
victim to find the attacker) and (b) a different address on every
packet (making it difficult for the victim to block the attack).
If the attacker needs to receive traffic from the victim, the
attacker must either (c) reveal its own IP address to the victim
(which enables the victim to find the attacker or block the attack
by filtering) or (d) provide a false address and also subvert
network routing mechanisms to divert the returning packets to the
attacker (which makes the attack more complex, more difficult, or
more expensive). [R3552]
$ block
(I) A bit string or bit vector of finite length. (See: bit, block
cipher. Compare: byte, word.)
Usage: An "N-bit block" contains N bits, which usually are
numbered from left to right as 1, 2, 3, ..., N.
$ block cipher
(I) An encryption algorithm that breaks plain text into fixed-size
segments and uses the same key to transform each plaintext segment
into a fixed-size segment of cipher text. Examples: AES, Blowfish,
DEA, IDEA, RC2, and SKIPJACK. (See: block, mode. Compare: stream
cipher.)
Tutorial: A block cipher can be adapted to have a different
external interface, such as that of a stream cipher, by using a
mode of cryptographic operation to package the basic algorithm.
(See: CBC, CCM, CFB, CMAC, CTR, DEA, ECB, OFB.)
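The fixed-size-segment idea can be sketched as follows. The "cipher"
here is a plain XOR with the key and is NOT secure; it is a
hypothetical stand-in that only illustrates the interface of a block
cipher applied to each segment independently (i.e., ECB-style
operation).

```python
BLOCK_SIZE = 8  # bytes per segment in this toy sketch

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Stand-in for a real block cipher such as AES: XOR each byte of
    # the block with the corresponding key byte. Insecure; for
    # illustration of the fixed-size interface only.
    return bytes(b ^ k for b, k in zip(block, key))

def ecb_encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == BLOCK_SIZE
    # Pad with zero bytes to a whole number of blocks. (A real system
    # would use an unambiguous padding scheme such as PKCS#7.)
    pad = (-len(plaintext)) % BLOCK_SIZE
    plaintext += b"\x00" * pad
    out = b""
    # Each fixed-size segment is transformed under the same key.
    for i in range(0, len(plaintext), BLOCK_SIZE):
        out += toy_block_encrypt(plaintext[i:i + BLOCK_SIZE], key)
    return out
```

Because XOR is its own inverse, applying ecb_encrypt twice with the
same key recovers the (padded) plaintext.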
$ Blowfish
(N) A symmetric block cipher with variable-length key (32 to 448
bits) designed in 1993 by Bruce Schneier as an unpatented,
license-free, royalty-free replacement for DES or IDEA. [Schn]
(See: Twofish.)
$ brain-damaged
(D) /slang/ "Obviously wrong: extremely poorly designed. Calling
something brain-damaged is very extreme. The word implies that the
thing is completely unusable, and that its failure to work is due
to poor design, not accident." [NCSSG] (See: flaw.)
Deprecated Term: It is likely that other cultures use different
metaphors for this concept. Therefore, to avoid international
misunderstanding, IDOCs SHOULD NOT use this term. (See: Deprecated
Usage under "Green Book".)
$ brand
1. (I) A distinctive mark or name that identifies a product or
business entity.
2. (O) /SET/ The name of a payment card. (See: BCA.)
Tutorial: Financial institutions and other companies have founded
payment card brands, protect and advertise the brands, establish
and enforce rules for use and acceptance of their payment cards,
and provide networks to interconnect the financial institutions.
These brands combine the roles of issuer and acquirer in
interactions with cardholders and merchants. [SET1]
$ brand certification authority (BCA)
(O) /SET/ A CA owned by a payment card brand, such as MasterCard,
Visa, or American Express. [SET2] (See: certification hierarchy,
SET.)
$ brand CRL identifier (BCI)
(O) /SET/ A digitally signed list, issued by a BCA, of the names
of CAs for which CRLs need to be processed when verifying
signatures in SET messages. [SET2]
$ break
(I) /cryptography/ To successfully perform cryptanalysis and thus
succeed in decrypting data or performing some other cryptographic
function, without initially having knowledge of the key that the
function requires. (See: penetrate, strength, work factor.)
Usage: This term applies to encrypted data or, more generally, to
a cryptographic algorithm or cryptographic system. Also, while the
most common use is to refer to completely breaking an algorithm,
the term is also used when a method is found that substantially
reduces the work factor.
$ Brewer-Nash model
(N) A security model [BN89] to enforce the Chinese wall policy.
(Compare: Bell-LaPadula model, Clark-Wilson model.)
Tutorial: All proprietary information in the set of commercial
firms F(1), F(2), ..., F(N) is categorized into mutually exclusive
conflict-of-interest classes I(1), I(2), ..., I(M) that apply
across all firms. Each firm belongs to exactly one class. The
Brewer-Nash model has the following mandatory rules:
- Brewer-Nash Read Rule: Subject S can read information object O
from firm F(i) only if either (a) O is from the same firm as
some object previously read by S *or* (b) O belongs to a class
I(i) from which S has not previously read any object. (See:
object, subject.)
- Brewer-Nash Write Rule: Subject S can write information object
O to firm F(i) only if (a) S can read O by the Brewer-Nash Read
Rule *and* (b) no object can be read by S from a different firm
F(j), no matter whether F(j) belongs to the same class as F(i)
or to a different class.
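The Read Rule can be sketched directly in code. The firm names, class
assignment, and function name below are hypothetical examples, not
taken from [BN89].

```python
# Map each firm to its conflict-of-interest class; every firm belongs
# to exactly one class.
FIRM_CLASS = {"BankA": "banks", "BankB": "banks", "OilX": "oil"}

def bn_read_ok(history: set, firm: str) -> bool:
    """Brewer-Nash Read Rule: a subject whose read history is
    `history` (the set of firm names it has already read from) may
    read from `firm` iff (a) it has read from that same firm before,
    or (b) it has never read from any firm in that firm's
    conflict-of-interest class."""
    if firm in history:  # rule (a): same firm as a previous read
        return True
    cls = FIRM_CLASS[firm]
    # Rule (b): no previously read firm is in the same class.
    return all(FIRM_CLASS[f] != cls for f in history)
```

For example, a subject that has read from BankA may continue reading
BankA or start reading OilX, but may never read BankB, which is in the
same conflict-of-interest class as BankA.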
$ bridge
(I) A gateway for traffic flowing at OSIRM Layer 2 between two
networks (usually two LANs). (Compare: bridge CA, router.)
$ bridge CA
(I) A PKI consisting of only a CA that cross-certifies with CAs of
some other PKIs. (See: cross-certification. Compare: bridge.)
Tutorial: A bridge CA functions as a hub that enables a
certificate user in any of the PKIs that attach to the bridge, to
validate certificates issued in the other attached PKIs.
For example, a bridge CA (BCA)           CA1
could cross-certify with four             ^
PKIs that have the roots CA1,             |
CA2, CA3, and CA4. The cross-             v
certificates that the roots      CA2 <-> BCA <-> CA3
exchange with the BCA enable an           ^
end entity EE1 certified under            |
CA1 in PKI 1 to construct a               v
certification path needed to             CA4
validate the certificate of
end entity EE2 under CA2,        CA1 -> BCA -> CA2 -> EE2
or vice versa.                   CA2 -> BCA -> CA1 -> EE1
$ British Standard 7799
(N) Part 1 of the standard is a code of practice for how to secure
an information system. Part 2 specifies the management framework,
objectives, and control requirements for information security
management systems. [BS7799] (See: ISO 17799.)
$ browser
(I) A client computer program that can retrieve and display
information from servers on the World Wide Web. Examples: Netscape
Navigator and Microsoft Internet Explorer.
$ brute force
(I) A cryptanalysis technique or other kind of attack method
involving an exhaustive procedure that tries a large number of
possible solutions to the problem. (See: impossible, strength,
work factor.)
Tutorial: In some cases, brute force involves trying all of the
possibilities. For example, for cipher text where the analyst
already knows the decryption algorithm, a brute-force technique
for finding matching plain text is to decrypt the message with
every possible key. In other cases, brute force involves trying a
large number of possibilities but substantially fewer than all of
them. For example, given a hash function that produces an N-bit
hash result, the probability is greater than 1/2 that the analyst
will find two inputs that have the same hash result after trying
only 2**(N/2) randomly chosen inputs. (See: birthday attack.)
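The "trying all of the possibilities" case can be sketched with a toy
cipher whose key space is small enough to exhaust: a single XOR byte,
so there are only 256 candidate keys. The cipher and function names
are hypothetical, for illustration only.

```python
def xor1(data: bytes, key: int) -> bytes:
    # Toy cipher: XOR every byte with a one-byte key (256 possible
    # keys). Trivially weak; chosen so brute force is instant.
    return bytes(b ^ key for b in data)

def brute_force(ciphertext: bytes, known_plaintext: bytes):
    """Exhaustively try all 256 keys; return the key whose decryption
    begins with the known plaintext fragment, or None."""
    for key in range(256):
        if xor1(ciphertext, key).startswith(known_plaintext):
            return key
    return None
```

A real key space (e.g., 2**128 keys for AES) makes this exhaustive
loop computationally infeasible, which is the point of measuring an
algorithm's strength by its work factor.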
$ BS7799
(N) See: British Standard 7799.
$ buffer overflow
(I) Any attack technique that exploits a vulnerability resulting
from computer software or hardware that does not check for
exceeding the bounds of a storage area when data is written into a
sequence of storage locations beginning in that area.
Tutorial: By causing a normal system operation to write data
beyond the bounds of a storage area, the attacker seeks to either
disrupt system operation or cause the system to execute malicious
software inserted by the attacker.
$ buffer zone
(I) A neutral internetwork segment used to connect other segments
that each operate under a different security policy.
Tutorial: To connect a private network to the Internet or some
other relatively public network, one could construct a small,
separate, isolated LAN and connect it to both the private network
and the public network; one or both of the connections would
implement a firewall to limit the traffic that could pass through
the buffer zone.
$ bulk encryption
1. (I) Encryption of multiple channels by aggregating them into a
single transfer path and then encrypting that path. (See:
channel.)
2. (O) "Simultaneous encryption of all channels of a multichannel
telecommunications link." [C4009] (Compare: bulk keying material.)
Usage: The use of "simultaneous" in definition 2 could be
interpreted to mean that multiple channels are encrypted
separately but at the same time. However, the common meaning of
the term is that multiple data flows are combined into a single
stream and then that stream is encrypted as a whole.
$ bulk key
(D) In a few published descriptions of hybrid encryption for SSH,
Windows 2000, and other applications, this term refers to a
symmetric key that (a) is used to encrypt a relatively large
amount of data and (b) is itself encrypted with a public key.
(Compare: bulk keying material, session key.)
Example: To send a large file to Bob, Alice (a) generates a
symmetric key and uses it to encrypt the file (i.e., encrypt the
bulk of the information that is to be sent) and then (b) encrypts
that symmetric key (the "bulk key") with Bob's public key.
Deprecated Term: IDOCs SHOULD NOT use this term or definition; the
term is not well-established and could be confused with the
established term "bulk keying material". Instead, use "symmetric
key" and carefully explain how the key is applied.
$ bulk keying material
(N) Refers to handling keying material in large quantities, e.g.,
as a dataset that contains many items of keying material. (See:
type 0. Compare: bulk key, bulk encryption.)
$ bump-in-the-stack
(I) An implementation approach that places a network security
mechanism inside the system that is to be protected. (Compare:
bump-in-the-wire.)
Example: IPsec can be implemented inboard, in the protocol stack
of an existing system or existing system design, by placing a new
layer between the existing IP layer and the local network drivers.
Source code access for the existing stack is not required, but the
system that contains the stack does need to be modified [R4301].
$ bump-in-the-wire
(I) An implementation approach that places a network security
mechanism outside of the system that is to be protected. (Compare:
bump-in-the-stack.)
Example: IPsec can be implemented outboard, in a physically
separate device, so that the system that receives the IPsec
protection does not need to be modified at all [R4301]. Military-
grade link encryption has mainly been implemented as bump-in-the-
wire devices.
$ business-case analysis
(N) An extended form of cost-benefit analysis that considers
factors beyond financial metrics, including security factors such
as the requirement for security services, their technical and
programmatic feasibility, their qualitative benefits, and
associated risks. (See: risk analysis.)
$ byte
(I) A fundamental unit of computer storage; the smallest
addressable unit in a computer's architecture. Usually holds one
character of information and, today, usually means eight bits.
(Compare: octet.)
Usage: Understood to be larger than a "bit", but smaller than a
"word". Although "byte" almost always means "octet" today, some
computer architectures have had bytes in other sizes (e.g., six
bits, nine bits). Therefore, an STD SHOULD state the number of
bits in a byte where the term is first used in the STD.