Black Box Understanding

Considered from an external point of view, the problem can be understood as a black box in two ways: as a behavior trying to accomplish a task, or as a software component processing information.

Informal Knowledge

Informally, the problem can be understood in terms of the task it must accomplish. We can consider the task as a behavior interacting with the environment (shooting), requiring information (enemy position and terrain), and producing an outcome (targeted projectile). See Figure 21.1.

Figure 21.1. Graphical representation of the task; the information is used to produce an outcome. The problem involves mapping inputs to outputs.


Such knowledge of the task is refined during the understanding phase of AI development (see Chapter 7, "Analysis and Understanding"). You can extend this informal black box knowledge by establishing the correspondence between situations and outcomes, generally with a case study. In addition, the criteria used to evaluate the behaviors provide an informal indication of the task's complexity. (The description of wall following is longer than that of obstacle avoidance, for instance.)

Software Specification

Informal knowledge of the task (high level) is often combined with a more formal approach to specify the problem as a software component (low level). The specification phase aims to describe the interfaces with other components (and the platform) by formalizing the data representation (see Chapter 9, "Specifications and Knowledge Representation").

Initial understanding of the problem as a software component is often refined during the application phase, using observations made from prototypes. For example, providing the target selection with knowledge of the terrain as well as the enemy is one such refinement. Once the formal interface is stable, developers can analyze the problem theoretically.
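
To make this concrete, a specification of the shooting task as a software component might look like the following sketch. The type and member names here are illustrative assumptions, not the interface used in the book; the point is simply that the inputs (enemy position and terrain) and the output (a targeted projectile) are given explicit data representations.

    // Hypothetical C++ sketch of the shooting task seen as a black box.
    // All names are illustrative assumptions.
    struct Vec3 { float x, y, z; };

    struct ShootingInputs
    {
        Vec3  enemyPosition;         // where the enemy was last seen
        float terrainHeights[16];    // coarse samples of the surrounding terrain
    };

    struct ShootingOutput
    {
        Vec3 aimDirection;           // direction in which to launch the projectile
        bool fire;                   // whether to shoot at all this frame
    };

    // The black box itself: a mapping from input configurations
    // to output configurations.
    class ShootingBehavior
    {
    public:
        virtual ~ShootingBehavior() {}
        virtual ShootingOutput Decide(const ShootingInputs& inputs) = 0;
    };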

Theoretical Analysis

The benefit of the specification is that we can reason about the inputs and outputs (the variables of the problem). Indeed, from the representation of the variables, we can deduce the theoretical complexity of the problem, which affects the capabilities of the AI and our ability to design a solution (see Figure 21.2).

Figure 21.2. Example problem with six variables, each with two to five values. Both inputs and outputs are treated identically.


The size of the problem essentially depends on the number of input and output configurations. As a reminder, each possible combination of parameter values forms one configuration. The set of all input configurations is known as the domain, and the set of all output configurations as the codomain.
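
For instance, a problem with two inputs taking 3 and 4 possible values and a single Boolean output has a domain of 3 × 4 = 12 configurations and a codomain of 2.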

The magnitude of these domains can be measured by two factors:

  • Dimensions The number of parameters in the problem, including input and output variables

  • Size The number of different values for each of the parameters

The total number of configurations for the problem is calculated by multiplying the sizes of all the dimensions (see Figure 21.3). As more parameters are introduced, the total number of configurations grows exponentially with the number of dimensions, because each new parameter multiplies the total by its size. This is known as the curse of dimensionality.

Figure 21.3. The set of configurations representing the inputs (left), the problem (middle), and the output (right). The size of the problem is the product of the input and output sizes.

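The calculation itself is just a running product over the dimension sizes. The sketch below is not code from the book; the six sizes shown are arbitrary values within the two-to-five range mentioned for Figure 21.2, since the exact figures are not listed here.

    // Multiply the sizes of all dimensions to obtain the total number of
    // configurations. The result grows exponentially with the number of dimensions.
    #include <cstddef>
    #include <iostream>
    #include <vector>

    unsigned long long CountConfigurations(const std::vector<unsigned>& sizes)
    {
        unsigned long long total = 1;
        for (std::size_t i = 0; i < sizes.size(); ++i)
            total *= sizes[i];
        return total;
    }

    int main()
    {
        // Six variables with between two and five values each (arbitrary example).
        const unsigned sizes[] = { 2, 3, 4, 5, 3, 2 };
        std::vector<unsigned> dimensions(sizes, sizes + 6);
        std::cout << CountConfigurations(dimensions) << " configurations" << std::endl;
        return 0;
    }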

Table 21.1 shows the size of an example problem: deciding whether to attack based on the identity of the enemy, its distance, and the current health. This simple problem has a total of 1,656,400 configurations. Adding another parameter, such as the enemy's health, would increase this size to 167,296,400!

Table 21.1. The Size of a Simple Problem: Deciding Whether to Attack Based on the Identity of the Enemy, Distance, and Health

Parameter     Type          Range            Size
Health        Integer       [0..100]         101
Distance      Integer       [0..1024]        1025
Enemy         Enumeration   [fred,...,jim]   8
Attack        Boolean       [true,false]     2

Problem size                                 1,656,400
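
Multiplying the sizes in the table confirms the figure: 101 × 1025 × 8 × 2 = 1,656,400 configurations. A fifth parameter with 101 values, such as the enemy's health over [0..100], multiplies this by 101, giving 167,296,400.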

The theoretical analysis reveals the worst-case scenario. In practice, however, problems are often simpler.


