Quake Editing Primer v0.2

This document is intended as a primer for Quake Editing. It is part of the Quake Documentation Project I am maintaining. Overview of sections:

Disclaimer: This document is delivered without warranty, in the hope that it will be useful blahdeblah. It is by no means an official document, and no relation to id Software is implied. Currently there is virtually no way to confirm or check any of the statements made here. -- b., June 19th, 1996.


Coordinate Systems

World coordinates are defined in a right handed coordinate system (RHS). Transformed world coordinates are again defined in a right handed system, with X pointing to the right side of the screen, and Y pointing upwards, thus Z pointing towards the observer. The distance to any visible object is thus negative. The major axes constitute the Axial Planes.

Units of Measurement

Brushes, Worlds and MAP files use integer coordinates. BSP lookups use floating point values. It might be necessary to consider floating point with Templates, too.

The basic unit of measurement is defined by the texel. If no scaling is applied, a 1x1 surface is exactly matched by one texel. For the time being, it is convenient to assume that the units of measurement described in Scott Amspoker's "DOOM Metrics" are approximately right:

In real world units, 1 unit is approximately 3cm vertically, i.e. 1 foot is approx. 10 units. Horizontally, with DOOM's distortion of the FOV, 1 unit has been around 2cm, i.e. 16 units equaled 1 foot.
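These rough conversions can be written down as two tiny helpers; the constants are the approximations quoted above from "DOOM Metrics", not engine facts.

```python
# Approximate conversions quoted above from "DOOM Metrics"; the
# constants (3 cm/unit, 16 units/foot) are rough estimates only.
def vertical_units_to_cm(units):
    return units * 3.0            # vertically, 1 unit is about 3 cm

def horizontal_feet_to_units(feet):
    return feet * 16              # horizontally, 16 units equaled 1 foot

height_cm = vertical_units_to_cm(10)   # roughly one foot of height
```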


These definitions are roughly based on, but not identical to, those used in the UQS or the QuakeEd help.


Point

A position in space, described by coordinates. A point in two-dimensional space is always called Point2D throughout this document, and has X and Y coordinates. A Point or Point3D has X, Y and Z coordinates: ( X, Y, Z ).


Vertex

A Vertex is a point in space, associated with a Plane or a Facet or a Surface. In DOOM, use of a simplified Vertex2D has been made possible by restricting the world geometry. In this document, we will almost always talk about Vertex3D vertices, or Vertices for convenience.

In principle, the three coordinates of a Point describe a Vertex; however, a Vertex is more complicated than a Point. A Vertex might describe a texture space location (s,t) as well, and thus is a point in 5D space: ( X, Y, Z, s, t ). In addition, a Vertex has a sense of direction, i.e. a normal vector pointing to the outside of the object it is associated with. This normal could simply be implied by the polygon the vertex is part of, or given explicitly.
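As a data structure, such a vertex might look like the following sketch; the field names are mine, not from any id source.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Vertex:
    """A vertex as described above: a point in 5D space plus a normal."""
    x: float
    y: float
    z: float
    s: float = 0.0                 # texture space location
    t: float = 0.0
    # Outward normal; None means "implied by the owning polygon"
    normal: Optional[Tuple[float, float, float]] = None

    def point5d(self):
        return (self.x, self.y, self.z, self.s, self.t)

v = Vertex(192.0, 512.0, 128.0, s=16.0, t=32.0)
```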

Vertices are used to define Planes for Brushes. Unfortunately, the early MAP files used by QuakeEd and related development tools use a poorly designed definition of texture space coordinates with the Facets of each Brush. The issue of Naturally Aligned Textures is related to this.


Plane

A Plane is an infinite object. It separates space into two half spaces (as employed by partition planes used with BSP). Two intersecting Planes define an Edge. Three intersecting Planes define a Vertex.

In Quake, Planes are defined in two different ways: by three points (Brushes), or by a normal and an offset (BSP file). The Brush Planes are specified by three points in clockwise order. The points that define the plane are not necessarily points on the Brush, just three non-collinear points on the Plane. A typical Plane definition within a Brush looks like this:

 ( 192 512 128 ) ( 208 512 128 ) ( 208 512 0 ) TECH07_2 0 0 0 1 1
This representation has been chosen instead of an explicit normal/distance plane representation because it allows for keeping perfect precision with integral values: it guarantees integer values for anything we create in an editor, lessening concern over floating point creep.
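A minimal sketch of recovering the normal/distance form from the three points, assuming the clockwise winding described above (the exact cross product order id uses is my guess):

```python
def plane_from_points(p0, p1, p2):
    """Derive (normal, dist) from three points on the plane.

    Winding convention assumed here: clockwise when viewed from the
    front/outside of the brush, so the cross product of (p0-p1) and
    (p2-p1) points outward.
    """
    t1 = tuple(a - b for a, b in zip(p0, p1))
    t2 = tuple(a - b for a, b in zip(p2, p1))
    n = (t1[1]*t2[2] - t1[2]*t2[1],
         t1[2]*t2[0] - t1[0]*t2[2],
         t1[0]*t2[1] - t1[1]*t2[0])
    length = (n[0]**2 + n[1]**2 + n[2]**2) ** 0.5
    normal = tuple(c / length for c in n)
    dist = sum(nc * pc for nc, pc in zip(normal, p0))
    return normal, dist

# The example plane above lies in y = 512 and faces -Y:
normal, dist = plane_from_points((192, 512, 128), (208, 512, 128), (208, 512, 0))
```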

The trailing zeros in a plane like

  "( 0 816 0 ) ( 0 512 0 ) ( 16 512 0 ) TECH07_2    0 0 0 1.0 1.0"
are supposedly s_ofs, t_ofs, rot, s_scale and t_scale, all with respect to the NATs. Note that the scaling is always 1 currently, while NATs implicitly scale a texture by 1.41 in the worst case, and that rotation is zero. I do not know whether rotation and scaling are actually usable right now. The texture offsets might not be restricted to multiples of 16 anymore. Finally, it is an open question whether a negative s_scale or t_scale allows for mirroring.
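Under the field layout described above, a Brush plane line can be pulled apart like this (a sketch, not a full MAP parser):

```python
import re

def parse_map_plane(line):
    """Split one Brush plane line from a MAP file into its parts.

    Field layout assumed from the text above: three points in
    parentheses, a texture name, then s_ofs, t_ofs, rot, s_scale,
    t_scale.
    """
    pts = [tuple(float(v) for v in m.split())
           for m in re.findall(r'\(([^)]+)\)', line)]
    rest = line[line.rfind(')') + 1:].split()
    name = rest[0]
    s_ofs, t_ofs, rot, s_scale, t_scale = (float(v) for v in rest[1:6])
    return pts, name, dict(s_ofs=s_ofs, t_ofs=t_ofs, rot=rot,
                           s_scale=s_scale, t_scale=t_scale)

pts, name, tex = parse_map_plane(
    '( 0 816 0 ) ( 0 512 0 ) ( 16 512 0 ) TECH07_2    0 0 0 1.0 1.0')
```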

In addition, Planes are classified according to their major orientation with respect to the World's coordinate system: John Carmack seems to call these Axial Planes. This classification is used for two purposes: Naturally Aligned Textures, and the Lighting Maps. Both are free of distortion on Axial Planes only.
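Classifying a plane by its dominant normal component might be sketched as follows; how id breaks ties between equal components is unknown.

```python
def axial_class(normal):
    """Classify a plane by the dominant axis of its normal, giving the
    six classes (+X, -X, +Y, -Y, +Z, -Z) mentioned above. Tie-breaking
    between equal components is arbitrary here (max picks the first)."""
    axis = max(range(3), key=lambda i: abs(normal[i]))
    return ('+' if normal[axis] >= 0 else '-') + 'XYZ'[axis]

cls = axial_class((0.3, -0.9, 0.1))    # a mostly -Y facing surface
```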


Facet

A convex polygon, e.g. each face of a convex polyhedron. Each facet has vertices and edges, the former shared with any number of other facets, the latter shared with precisely one other facet.

With respect to Brushes, the most important thing to recognize is that a Brush only implicitly defines the Facets; the Polygons in the World are created from Facets by taking into account intersections of Brushes with Brushes, and those Polygons again might be split during BSP calculation. In the end, you will have a lot of different, and probably a lot more, Polygons.


Brush

QuakeEd's approach to map editing uses "building blocks", which id calls "brushes". A Brush is the basic geometric building block, namely a convex polyhedron specified by four or more Planes. The Facets are defined by Planes, and the actual extent of each Facet (the polygon as edges and vertices) is implicitly defined by Plane intersections. Each Facet can have a separate texture.

Basically, the idea of a Brush is to create and handle solid bodies of the same material. The term Brush somehow suggests the concept of painting a texture on an empty volume of space. This source format for Quake levels was chosen because it is conceptually very simple and robust. There is no way to create invalid data, like the HOM effects in DOOM, and there cannot be double sided polygons. I do not know how "qbsp" and the renderer handle a solid Brush with zero volume, but I hope it raises an error. The editor might have to check for invalid Plane definitions (those that have identical normal vectors) within a Brush.
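Such an editor-side sanity check might look like this sketch:

```python
def duplicate_normal_planes(normals, eps=1e-6):
    """Find pairs of planes within one brush whose normals coincide -
    the invalid definition the text says an editor should reject."""
    bad = []
    for i in range(len(normals)):
        for j in range(i + 1, len(normals)):
            if all(abs(a - b) < eps for a, b in zip(normals[i], normals[j])):
                bad.append((i, j))
    return bad

# Planes 0 and 1 share the normal (0, 0, 1): an invalid brush.
bad = duplicate_normal_planes([(0, 0, 1), (0, 0, 1), (1, 0, 0)])
```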

However, the so-called solid blocks of material are hollow (only a boundary representation is created). More importantly, you can freely interpenetrate brushes, and qbsp will chop things up as needed. There are no restrictions whatsoever on the geometry of a map, but 90% of the brushes tend to be simple axial blocks, i.e. rectangular "cuboids" with six Planes, thus six Facets, each one perfectly aligned with the Axial Planes/world coordinate axes. A large map may have over 1000 brushes.

With respect to the Template concept, a Brush is an instantiated Brush Template. It has been scaled and mapped to integer coordinates again, which might affect its shape if it is a complicated but small polyhedron (e.g. one approximating a sphere). It has textures, or an external texture entity. A Brush cannot be scaled easily: the texture alignment and even the shape might change. If NATs have to be used, a Brush cannot be rotated without affecting the texturing. Moving a Brush with NATs means that texture alignment is guaranteed only for those surfaces sharing the same Basic Orientation. As all convex polyhedra have Facets of at least four Basic Orientations, there will be no seamless NAT.

Obviously, it takes at least 4 planes to define a building block (a tetrahedron). Typically, Brushes with six surfaces will be used, which might look like this:

  ( 192 512 128 ) ( 208 512 128 ) ( 208 512 0 )   SOME_TEX   0 0 0 1.0 1.0
  ( 192 816 128 ) ( 192 512 128 ) ( 192 512 0 )   TEX1       0 0 0 1.0 1.0
  ( 192 704 128 ) ( 176 704 128 ) ( 176 704 0 )   TEX1       0 0 0 1.0 1.0
  ( 208 512 128 ) ( 208 816 128 ) ( 208 816 0 )   TEX2       0 0 0 1.0 1.0
  ( 208 512 128 ) ( 192 512 128 ) ( 192 816 128 ) TOP_TEX    0 0 0 1.0 1.0
  ( 192 816 0 ) ( 192 512 0 ) ( 208 512 0 )       BOTTOM_TEX 0 0 0 1.0 1.0
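Since three intersecting Planes define a Vertex, the corners of such a brush can be recovered with the standard three-plane intersection formula:

```python
def intersect_planes(p1, p2, p3):
    """Vertex where three planes, each given as (normal, dist), meet.

    Solves n_i . x = d_i via the closed form
    x = (d1 (n2 x n3) + d2 (n3 x n1) + d3 (n1 x n2)) / (n1 . (n2 x n3));
    returns None if the planes are (nearly) parallel."""
    (n1, d1), (n2, d2), (n3, d3) = p1, p2, p3
    def cross(a, b):
        return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
    def dot(a, b):
        return sum(x*y for x, y in zip(a, b))
    denom = dot(n1, cross(n2, n3))
    if abs(denom) < 1e-6:
        return None
    return tuple((d1*c1 + d2*c2 + d3*c3) / denom
                 for c1, c2, c3 in zip(cross(n2, n3), cross(n3, n1),
                                       cross(n1, n2)))

# Recover one corner of the example brush above: the planes y=512,
# x=192 and z=128 (top) meet at ( 192 512 128 ).
v = intersect_planes(((0, -1, 0), -512), ((-1, 0, 0), -192), ((0, 0, 1), 128))
```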


Entities

I will introduce a definition that is a bit different from the official one, for the sake of clarity. Do not mistake the Entity as coined here for the Object.


Entity is a term used in event simulation. In this case, it means an independent part of the virtual environment. Entities have a current state, and are subject to rules on how to change state, called the laws of behaviour. In Quake, each object is an entity: the world itself, any moving part of the world, any monster. Lights, doors, monsters, etc. are all entities.

As there is no clear separation of physical objects (bodies, avatars) and entities (the abstract representation of a modeling component in an event/state-driven simulation), entities have both physical and more abstract properties. Thus a Quake Entity consists of zero or more key/value attribute pairs and a set of zero or more solid brushes. Entities can either have a fixed, non-modifiable size (monsters, lights, etc.), or can be represented by an arbitrary collection of brushes (doors, plats, bridges, etc.).

From QuakeEd: An entity is created by selecting one or more brushes in the world, then selecting a class in the Entity Browser, then clicking Create Entity (or double click the class name). If the entity class has a fixed size, the selected brushes are removed and a fixed sized brush is put in their place. Otherwise, the selected brushes are removed from the world entity and made the contents of the newly created entity.

Key/Value Pairs

A key/value pair is just two strings that specify some information about the Entity, like "classname" "monster_demon" or "light" "300". The order of the key/value pairs is not significant. The usable keys are defined in the .qc progs. They are fully user extendable.
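Extracting the pairs from an entity block is straightforward; this sketch assumes values never contain escaped quotes:

```python
import re

def parse_keyvalues(text):
    """Collect the "key" "value" pairs of one entity block.

    Keys are free-form strings defined by the .qc progs; since their
    order is not significant, a dict is a fair representation.
    """
    return dict(re.findall(r'"([^"]*)"\s+"([^"]*)"', text))

ent = parse_keyvalues('"classname" "monster_demon"\n"light" "300"')
```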


World

Quake utilizes rendering strategies for visible surface determination, culling and clipping that work well with densely occluded, mostly static environments. The static environment is called a World. Thus the borders of the World limit the range of view (fake skies neglected). The World is an Entity, too, in Quake always entity 0. Any other object is restricted in its movements to this World.


Texture Definition

Introduced by QuakeEd and the NAT-related approach to texture mapping, a Texture Definition is given by a texture name with offset numbers and flip flags. This determines what will be drawn on a Brush Facet generated from the Plane the Texture Definition is associated with. In an editor like QuakeEd, all Facets of a Brush will copy the same Texture Definition by default, and you will have to explicitly change single Facets if desired.

Texture Definitions, defined by offsets, a fixed set of rotations, and mirror flips, do not allow for scaling. Implicitly, scaling is determined by the Plane's orientation with respect to the world coordinate system. This might change with later releases of Quake, and in general an editor should internally use a more general representation.

Texture Alignment

With DOOM, Texture Alignment made the difference between well done and mediocre levels. Texture Alignment has to be done manually, even if restrictions like NATs are employed. This was a problem even within the restricted world of DOOM; in a 3D environment, things are going to be worse. The most important thing to recognize is that Texture Alignment has to be done for each Facet separately, but with the adjacent Facets within view, and that it can only be done interactively: the consequences of a given change must be visible during the change.

In the words of John Carmack: "There are a couple restrictions on texturing. In hindsight, there didn't need to be, but I don't have the time to fix it right now."

Textures are "naturally aligned", rather than tied to a specific brush. This means that if you build steps next to each other, the sides will not have any texture seams, no matter what size the brushes are. If necessary, the texture plane on each brush face can be offset in multiples of 16 pixels, and the texture axis can be negated or interchanged.

Naturally Aligned Textures

Let us take a look at those restrictions that come with NATs ("naturally aligned textures"). Quake qtest1 uses a simplified mapping from 3D world space to 2D texture space, and does not assign texture space coordinates s,t to 3D vertices, but instead to their projected 2D equivalents. The projection is determined by assigning each surface's plane to one of six classes:

The projection thus takes into account the orientation of the surface. This means that a surface

   (-1,-1,0), (+1,-1,0), (+1,+1,0), (-1,+1,0)
counterclockwise, in a right-handed world coordinate system, facing -z, and e.g. the surfaces
   (-1,-1, Z ),   (+1,-1, Z ), (+1,+1, 0 ), (-1,+1, 0 )
   (-1,-1, 0 ),   (+1,-1, Z ), (+1,+1, Z ), (-1,+1, 0 )
   (-1,-1, 0 ),   (+1,-1, 0 ), (+1,+1, Z ), (-1,+1, Z )
   (-1,-1, Z ),   (+1,-1, 0 ), (+1,+1, 0 ), (-1,+1, Z )
will have the same mapping, as long as +1 > Z > -1. This means that sloped surfaces that still face the same one of the six directions have the same projection to texture space, which in turn means that the texture will be scaled up by a factor of up to 1.41, depending on the surface's slope. In other words, the texture tiling width seems to depend on the surface's orientation; with alpha being the angle of a given slope, 1/cos(alpha) gives a scaling between 1 and 1.41, thus in the worst case every second pixel is drawn twice.

Note that it is actually either +1 >= Z or Z >= -1, because the boundary surfaces have to be mapped to one of the six directions, too.
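The 1/cos(alpha) stretch can be computed directly from a surface normal; this is a sketch of the geometric argument above, not id's code:

```python
import math

def nat_scale(normal):
    """Texture stretch caused by projecting along the dominant axis.

    s,t come from the two non-dominant world coordinates, so a surface
    tilted by angle alpha away from its axial plane is stretched by
    1/cos(alpha); at 45 degrees that is sqrt(2) ~ 1.41.
    """
    n = [abs(c) for c in normal]
    length = math.sqrt(sum(c * c for c in n))
    cos_alpha = max(n) / length    # cosine of the tilt from the axial plane
    return 1.0 / cos_alpha

worst = nat_scale((1, 1, 0))       # a 45-degree slope: the worst case
```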

Texture Transformations

The NATs determine the texture space coordinates of a vertex. However, there are said to be modifiers that allow for transformations of the textures:

This means that moving a group of Brushes will change the textures, while the natural alignment on adjacent brushes, and any other operations applied to the brush, will be kept. In consequence, texture-alignment-preserving shifts need to correct s_ofs and t_ofs appropriately.
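A sketch of such an offset correction; which two world axes feed s and t for each class, and the sign convention, are assumptions on my part:

```python
# Assumed s/t axis assignment per dominant axis: the two non-dominant
# world coordinates (x/y for Z-facing surfaces, and so on).
ST_AXES = {'X': (1, 2), 'Y': (0, 2), 'Z': (0, 1)}

def corrected_offsets(s_ofs, t_ofs, dominant, delta):
    """Adjust s_ofs/t_ofs so a brush moved by `delta` keeps its texture
    visually glued to the brush rather than to world space.

    The sign of the correction is an assumption; an editor would verify
    it against the renderer."""
    si, ti = ST_AXES[dominant]
    return s_ofs - delta[si], t_ofs - delta[ti]

# A floor (Z-facing) brush moved 16 units in x and 32 in y:
new_s, new_t = corrected_offsets(0, 0, 'Z', (16, 32, 0))
```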

Mipmapping Textures

Anti-aliasing, preprocessed sampling, dithering with indexed color, limited mipmap resolution.

Transparency and Translucency

Quake is an indexed color engine, and does not support RGBA (although the lightmaps are alpha channel maps, in a way). There is no mipmapping hack for billboards/sprites, and no indexed color lookup for translucency and colored glass. For the same reason, water, slime and lava surfaces are opaque.

A perfectly transparent, invisible obstacle can be created using a Brush with a "clip" texture. It will be used to create the clip hull, but not the draw hull.

A certain amount of translucency can be created exploiting the properties of dynamic lighting.


Lighting

To understand lighting, you have to understand the light source entities, the preprocessing of the lighting maps, and the actual algorithm the engine uses to calculate the light level for each pixel during surface processing.

Indexed color light levels

Quake is an indexed color architecture, thus there are effectively 256 colors or less. Lighting is done by remapping a color index to the index of a darker color, using a lookup table. This limits the number of light levels, probably to 32, as with DOOM. The actual resolution is less, as color ramps in the Quake color table are limited to 16 or 8 shades of a primary. The lookup table is precalculated, and was called COLORMAP in DOOM.
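A COLORMAP-style table can be precalculated as in this sketch, here with a toy grey palette instead of the real 256-color one:

```python
def build_colormap(palette, levels=32):
    """Precalculate a DOOM COLORMAP-style table: for each light level,
    remap every palette index to the index of a suitably darker color."""
    def nearest(rgb):
        return min(range(len(palette)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(palette[i], rgb)))
    maps = []
    for level in range(levels):
        scale = level / (levels - 1)      # 0.0 = black, 1.0 = full bright
        maps.append([nearest(tuple(c * scale for c in rgb))
                     for rgb in palette])
    return maps

# Toy 4-color grey ramp standing in for the real palette
cmap = build_colormap([(0, 0, 0), (85, 85, 85), (170, 170, 170),
                       (255, 255, 255)], levels=4)
```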


Lightmaps

Quake uses a couple of precalculated lookup tables to speed up computationally expensive rendering tasks. Precalculation requires the scenery to be static. The lighting is mostly fixed, and this part is precalculated. The Lightmaps could be considered as alpha maps implying alpha-blending with a black texture. However, the Lightmaps are coarse, calculated on a world axis aligned grid (see Naturally Aligned Textures). The grid is 16 x 16 units (texels) on Axial Planes, and will be mapped to about 22 pixels in the worst case. Between grid points, the lighting values are linearly interpolated. Thus, despite the ray casting, no hard shadow edges are possible, as each change is softened across at least those 16 pixels. This prevents aliasing as well, as Lightmaps are not mipmapped. However, with a deep field of view, aliasing because of lighting is possible, thus large areas should not have small scale variations of lighting.
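The interpolation between grid points can be sketched as plain bilinear filtering on the 16-texel grid (the real sampling details are unknown):

```python
def sample_lightmap(grid, u, v):
    """Bilinearly interpolate a lightmap sampled every 16 texels.

    `grid` is a 2D list of light values at the grid points; (u, v) are
    texel coordinates that must lie inside the grid.
    """
    gu, gv = u / 16.0, v / 16.0
    i, j = int(gu), int(gv)
    fu, fv = gu - i, gv - j
    a = grid[j][i]     * (1 - fu) + grid[j][i + 1]     * fu
    b = grid[j + 1][i] * (1 - fu) + grid[j + 1][i + 1] * fu
    return a * (1 - fv) + b * fv

# Light rising from 0 to 255 across one 16-texel cell: halfway is 127.5,
# which is why a lighting change is softened across at least 16 texels.
mid = sample_lightmap([[0, 255], [0, 255]], 8, 8)
```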

Scripting ambient light level

The ambient light level added to the light map might be controlled by QuakeC scripts. This is not dynamic lighting (see below), as it is not, or not necessarily, controlled by moving light sources. It is just a simple way to change the ambient lighting, affecting entire surfaces or areas.

From a report by Tom Mustaine on how the light scripting works: light levels for Quake light sources are represented as letters from A-Z, where 'A' represents the darkest, 'M' mid level lighting, and 'Z' the brightest. Lighting effects are sequenced via QuakeC and can be done in just about any way imaginable. If you want a light to smoothly pulse, you set up a repeating sequence in QuakeC and assign each custom sequence to your lights.

Examples: a smoothly pulsating light, from total darkness to mid level light, could look like "ABCDEFGHIJKLMLKJIHGFEDCB", repeated via QuakeC. If you want a light to blink on and off quickly via a switch, "ADGJM" would be the on state for a quick yet smooth dark-to-mid-light transition, or "MA" for an instant darkness switch.
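Decoding such a style string might look like this; the exact brightness each letter maps to in the engine is not documented here, so a linear A-Z ramp is assumed:

```python
def style_to_levels(style):
    """Turn a light style string like "ABCDEFGHIJKLM" into fractional
    brightness values: 'A' = 0.0 (darkest), 'Z' = 1.0 (brightest).

    A linear ramp is an assumption; the engine's actual letter-to-level
    mapping may differ."""
    span = ord('Z') - ord('A')
    return [(ord(c) - ord('A')) / span for c in style.upper()]

pulse = style_to_levels('AMZ')    # dark, mid-ish, full bright
```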

Dynamic lighting

The Quake engine has a limited ability to do dynamic lighting. To understand how this is done, you have to have a working knowledge of how the surfaces are created from polygons, textures and light maps (see above). The dynamic lighting is done at the surface cache level, where every light sample on the grid has the dynamic light's intensity minus distance added in. Having the entire scene dynamically lit cuts the game speed by about 30%, so it is mostly used for effects and accents.

It is not known how many dynamic lights are taken into account. Note that this means that for each change of the dynamic light sources, the surface has to be processed anew.

The Searchlight Effect

Note that Quake does not do real time shadows. That means that the dynamic lighting will not take into account any obstacles between light source and surface. A light source behind a thin wall will create a light spot on a wall that is facing away from the light source. There is no backface culling done, simply because detecting walls that are facing away does not mean floors and ceilings in adjacent rooms will be culled as well. In other words, backface culling would remove searchlight artifacts on surfaces that are not facing the light source, but not on any other surfaces - you would not see continuous light spots, but instead leakage on the floors only.

Thus dark areas should be separated by sufficiently thick walls from areas in which light sources will continuously be spawned - the spotlights diminish rapidly with increasing distance.

The searchlight artifact might even be used for special effects. Imagine a texture resembling dark, colored glass, on a thin wall. Any firefight behind that wall will cast shine-through lights on the otherwise opaque surface - a special effect that provides the illusion of translucency which you will not find in Quake.


Objects

Objects are sometimes called Things or (mistakenly) Entities. Throughout this document, an Object is the actual representation within the game: a bunch of polygons created from a collection of brushes (a BSP object), from an Alias model (MDL object), or represented by a so-called sprite (billboard object). For each Object, there has to be one Entity, i.e. Objects are mapped one-to-one to Entities. Note that not all Entities are mapped to Objects. The Entity is subject to QuakeC and represents the internal state; the Object is the visual representation displayed by the Quake renderer.

In the following section we are solely dealing with visual appearance and the definition of the Objects. We are not talking about Entities, simulation, and QuakeC.

BSP Objects

Important, as they are treated like world polygons: they are occluders. This is most important for doors. They provide means for level design to limit the range of view, and have an impact on frame rate.

A BSP object does not change its geometry. It has no animation frames, and is usually useless for articulated bodies that move. BSP objects represent items (weapons, boxes), parts of the world that move (platforms, doors, trains). They could be used to represent complicated interiors. Note that BSP objects are not walkable, and that collision detection is done using bounding boxes.

MDL Objects

These are the key-framed animations of players and monsters that are created using Alias. These Objects consist of a list of triangles. For each frame of movement, there has to be a complete description of how to rearrange the triangles. It might be possible to create simple keyframed animations, however a replacement of a monster or player Object will require professional tools for keyframed animation based on Inverse Kinematics, and conversion from file formats like DXF or 3DS. There are reasonably priced tools like Martin Hash's 3D Animation, but the learning curve is high, and conversion to MDL is not straightforward.

MDL Objects are rendered using a z-buffer. This has several advantages, as any arrangement of triangles is possible: there is no need to restrict to convex polyhedra, or avoid intersecting surfaces.

Billboard Objects

Billboard Objects are partly transparent 2D textures that are mapped on a 3D Plane. Period. In DOOM and Quake descriptions, Billboard Objects are commonly called sprites, but two sided partly transparent walls are Billboard Objects as well. Billboard Objects are thus important with respect to WAD conversion issues.

The so-called DOOM sprites are a set of 5 or 8 Billboards, one of them selected depending on the current angle of view of the Object: the Rotations. Animated objects use several Frames. There are a lot of single rotation, and even single frame, sprites in DOOM. This worked well, as the restrictions the DOOM renderer imposed on the view ensured a Billboard worked for objects that could roughly be represented by a cylinder (e.g. a torch). In a 3D world, spherical symmetry would be required in general.

Occasionally, someone claimed that Billboard Objects will not be supported in the final release of Quake. Quoting John Carmack: "They are going to stay in. All of the sprite types you mention are currently supported, and there is also a sprite type that always has the XY normal pointing towards the view, which is slightly different than parallel to the view plane."

Supported billboard objects:

Quake does not support Rotations. Objects like torches do not have spherical symmetry, and level design should try to put them at locations that do not allow for a look straight down.

Applications: there are, after all, objects with spherical symmetry that would be memory and time intensive if approximated by polygon models: light bulbs and roughly spherical light sources, illuminated balloons. In addition, there are very irregular and detailed objects with nearly no symmetry: cloud-like explosions, fireballs. Trees might be composed out of several intersecting billboards. Billboard objects with cylindrical symmetry (flames, torches, pipes) might be used in areas in which scene geometry restricts the observer's movements and perspective. Partly transparent walls allow for rotating fans, detailed grates, stained glass.

However, note that Billboard Objects are unfortunately not mipmapped. This means that, in most cases, they will stick out visually and look startlingly non-3D, and this will restrict their use. It is not recommended to replace a partly transparent grate wall from DOOM with a Quake Billboard Object - create appropriate brushes instead.

Editing Tools


See QuakeEd pages.

Using DOOM editing tools

The fastest approach to a Quake-compliant editing tool is using DEU or another DOOM editing utility, and a 2D BSP builder. The technical details of creating a 3D BSP from a 2D BSP are described in the WAD conversion pages. The idea is to keep the GUI (and, implicitly, the restrictions on floors and ceilings) for now, and follow a migration path to an editor that is fully 3D internally, and will be able to make use of qbsp even while the GUI is not capable of 3D editing.

Importing WAD files

This requires WAD to MAP conversion.

Importing other scene descriptions

Importing DXF or 3DS is probably a good idea. There are two related issues: the building block description based on Brushes, as used in MAP files, and the uncommon natural texture alignment as used by Quake qtest1. The former has to be handled for conversion of DOOM levels as well.

Conversion of DXF, 3DS etc. to Brushes will require some thinking. I imagine that polygon based descriptions will have to be extended to building blocks again. Usually, polygon based descriptions provide only the inside of a convex polyhedron. It might be difficult to identify which polygon belongs to which polyhedron. Once all polygons of a given polyhedron are identified, one might determine the center point, and extend every vertex by one or two units on a ray from the center. This will generate a very thin building block with NoOf_Edges+2 planes from each polygon, and NoOf_Polygons building blocks that just touch each other.
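The extension step might be sketched like this; the choice of centroid and a 1-unit push is taken from the description above:

```python
def extrude_polygon(vertices, thickness=1.0):
    """Push each vertex of a planar polygon outward on a ray from the
    centroid, as described above. The original polygon plus the pushed
    copy bound a thin building block with NoOf_Edges+2 planes."""
    n = len(vertices)
    cx = sum(v[0] for v in vertices) / n
    cy = sum(v[1] for v in vertices) / n
    cz = sum(v[2] for v in vertices) / n
    outer = []
    for x, y, z in vertices:
        dx, dy, dz = x - cx, y - cy, z - cz
        d = (dx*dx + dy*dy + dz*dz) ** 0.5
        outer.append((x + dx/d*thickness, y + dy/d*thickness,
                      z + dz/d*thickness))
    return outer

# A unit-thickness push of a 10x10 square in the z=0 plane
outer = extrude_polygon([(0, 0, 0), (10, 0, 0), (10, 10, 0), (0, 10, 0)])
```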

Development Tools

Lookups for Rendering

There are several lookups to be generated for speedy rendering of a given world. This is done by tools reading a MAP file and creating a BSP file. It is important to understand the related concepts in order to exploit the Quake renderer's capabilities to the fullest while avoiding putting too much workload on the hardware.

These lookups are discussed separately in this document. Some (like the Color LUT or the Lightmaps) have to be understood in detail for good level editing; others (like the PVS) should be understood in principle, or your level might exceed the capability of both tools and renderer.

Quoting John Carmack: "I definitely intend to release the utility suite that processes the editor output into game .bsp files. There are three utilities: "qbsp", "light", and "vis". These are very compute intensive tasks (especially vis). Doing the full blown computation on a large map can take over ten minutes, but there are many options to make the development turnaround time faster. "light" and "vis" are fully multi-threaded, so those of you with access to SMP systems can put them to constructive use. I bet a lot of university computers are going to start seeing some heavy loads :-)." According to him, id runs those tools on a 4-CPU Alpha box with 128 MB RAM, satan.idsoftware.com, remotely launched from QuakeEd.

During development, subsets of the suite are run, and only for full testing is the vis program run. From the quake.qpr Project file released with the QuakeEd sources:

	{"bspfullvis"	"qbsp $1 $2 ; light -extra $2 ; vis $2@"}
	{"bspfastvis"	"qbsp $1 $2 ; light  $2 ; vis -fast $2@"}
	{"bspnovis"	"qbsp $1 $2 ; light $2@"}
	{"bsprelight"	"qbsp -onlyents $1 $2 ; light -extra $2@"}
	{"bspleaktest"	"qbsp -mark -notjunc $1 $2 ; light $2@"}
	{"bspentities"	"qbsp -onlyents $1 $2@"}
As can be seen from this sequence, the calculation of Lightmaps did not use the PVS.

BSP Builder

The qbsp utility builds minimized drawing and clipping hulls from a solid geometry description of the world, i.e. creates BSP files by reading MAP files. This is the memory pig that can chew up 50+ resident megs on big levels. Single-threaded, but most levels only take a minute or so to run. It has the following options:

Default is a full BSP. The entities-only processing step is useful during the placement of light sources (for Lightmap calculation), and during placement of monsters and powerups.

Inclusion of Mipmapped Textures

Levels carry their own textures with them. This bulks the file up, but it makes introducing new graphics much simpler. Earlier on, id used an IWAD wadfile (similar to the DOOM format) to hold a palette of textures used to create a level (presumably including TEXTURES and PNAMES). Now the mipmapped textures are processed only once and stored in a WAD2 file. The worldspawn entity has a "wad" key that specifies the location of that WAD2 file. The qbsp utility extracts the textures that are actually used, and puts them in the output BSP file.

Sealing the Map

A MAP file does not necessarily have a closed hull or border. There might be small gaps, reasonably sized openings and bulky holes to empty space. However, the basic idea of Quake-style rendering is restricting yourself to a densely occluded environment. Starting with the PVS calculation, the World has to be sealed. This means putting a shrink-wrap around all the Brushes (which might well be a convex hull), defining an Inside, removing all Facets/Polygons that are not visible from the Inside, and putting a Sky in front of all remaining gaps.

Closely related is the issue of small gaps, as created by nearly adjacent Brushes. Small gaps will look like texture mapping errors, and, worse, they will increase the PVS size, because in the worst case you will be able to see yet other BSP leaves through this gap. Preprocessing to identify small gaps and propose removal to the user is recommended to fix this.

However, John Carmack's qbsp utility does not fix any small gaps, but it removes all the crud on the outside of the level. This reportedly more than halves the data size. After building the BSP tree of the world, he builds a graph of the portals between the leaf nodes, then does a simple graph flood fill from the outside of the world. If it ever reaches a leaf with an entity (light, player, etc) in it, it assumes that it leaked inside the level and writes a pointfile containing a trail of dots that can be loaded into quake to find the path that went through the hole.
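The flood fill he describes can be sketched on a leaf adjacency graph; the data layout here is invented for illustration:

```python
from collections import deque

def find_leak(portals, outside, entity_leaves):
    """Flood-fill the leaf graph from the outside; if a leaf holding an
    entity is reached, the level leaks, and the path found is the kind
    of trail qbsp dumps as a pointfile.

    `portals` maps each leaf to its neighbours; `entity_leaves` is the
    set of leaves containing entities (light, player, etc.)."""
    prev = {outside: None}
    queue = deque([outside])
    while queue:
        leaf = queue.popleft()
        if leaf in entity_leaves:
            path = []
            while leaf is not None:
                path.append(leaf)
                leaf = prev[leaf]
            return path[::-1]        # trail from outside to the entity
        for nxt in portals.get(leaf, ()):
            if nxt not in prev:
                prev[nxt] = leaf
                queue.append(nxt)
    return None                      # sealed: nothing inside was reached

# A hole ('gap') connects the outside to an entity-bearing room:
leak = find_leak({'out': ['gap'], 'gap': ['room']}, 'out', {'room'})
```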

Note that the portals graph is not in the final BSP. It is supposedly stored separately, and read by the PVS builder, which uses portals. From the above description John Carmack provided, it seems that Quake has certain level debug capabilities, like loading and displaying said pointfile.

Merging of Planes

"qbsp" will handle two brushes that do not intersect, but merely touch. Contacting faces are merged out of existence. In case of a brush with zero thickness (planes are identical, normals have opposite directions): "I'm pretty sure that a brush that has no thickness in a given dimension would clip itself out of existance and not show up at all in the bsp file, but I'm not positive."
The equivalent of a two-sided wall is a billboard object/entity, not a brush.

PVS Builder

The vis utility builds the potentially visible set. Fully parallel. This is the CPU pig. It is wildly content dependent (combinatorially sensitive), so some levels may only take a minute or so, but there is one id level that takes nearly an hour even on 4 CPUs. It has the following options:

Default is a full PVS calculation. The faster processing will be useful while changing the world geometry. There should be possibilities for really crude but fast approximations, based simply on leaf-to-leaf distance. Basic ideas on the processing are related to the concept of Portals, and can be found e.g. in Seth Teller's PhD thesis.

Lightmap Builder

The light utility builds the surface lighting maps. Fully parallel, generally taking a few minutes to make a high quality run, or less for a non-filtered version. This is done with ray casting from each light source position to each grid point on each surface. Radiosity and area light source calculations with penumbrae are not done, but third party tools might do this. It has the following options:

The default seems to produce a less detailed result.

As has been noted above, the lightmap processing is not PVS-aware. Processing lightmaps might be sped up significantly using the PVS, but as PVS calculation is the most time-consuming step anyway, one would need both: a lightmap builder that does not need a PVS during the stages where placing light sources affects level layout, and a PVS-aware lightmap generation that might even allow for near-interactive placement of light sources once level geometry (and PVS) are fixed.

Mipmap Builder

The mipmaps are created by qlumpy into new-format WAD2 files. Those are only referenced by QuakeEd and qbsp, which copies the needed ones into the BSP file directly. According to John Carmack, they do occasionally take a loss in the mipmap creation process for lack of good colors in the indexed color palette, but they do an error diffusion during the resampling, which helps a lot.
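The principle of a mip chain - each level a box-filtered half-size copy of the previous one - can be sketched on greyscale values (real miptex data is indexed color, where plain averaging of indices would not work):

```python
def mip_chain(pixels, w, h, levels=4):
    """Build `levels` mip levels, each half the previous size, by
    box-filtering 2x2 blocks of greyscale values.

    Quake stores 4 levels per miptex; greyscale stands in here because
    averaging palette indices is only valid if the palette is a ramp."""
    mips = [pixels]
    for _ in range(levels - 1):
        nw, nh = w // 2, h // 2
        nxt = [0] * (nw * nh)
        for y in range(nh):
            for x in range(nw):
                s = (pixels[2*y*w + 2*x] + pixels[2*y*w + 2*x + 1] +
                     pixels[(2*y+1)*w + 2*x] + pixels[(2*y+1)*w + 2*x + 1])
                nxt[y*nw + x] = s // 4
        mips.append(nxt)
        pixels, w, h = nxt, nw, nh
    return mips

# A 2x2 checkerboard averages down to a single mid-grey texel
mips = mip_chain([0, 255, 255, 0], 2, 2, levels=2)
```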

PAK/WAD File Tools

None from id. Lots from others.

Editing Resources

Level editing will require additional resources: artwork. This means textures, brush templates and brush group templates, map files, MDL and BSP objects, billboards/sprites, and perhaps even palette and colormap lumps. There will be a lot of graphics provided by id Software, and John Carmack stated that the license will probably permit non-profit redistribution. This does not include derivative work based on id graphics, though. Using DOOM shareware textures and sprites does not work well because the palettes are very different. Modification of DOOM artwork to make it fit might well be prohibited.

Any collection of artwork will have to be organized in a couple of files in one of the following file formats, to be accessible for editing and development tools.

WAD Files

There are a couple of WAD files. A WAD2 file gfx.wad has been released with qtest1/id1.pak. The following WAD files are named in the quake.qpr Project file released with the QuakeEd sources:

Finally, there is of course the possibility to use the shareware DOOM WAD file to create a WAD2 file.

MAP files

I will list a few MAP files here that supposedly exist or existed, as examples. The first file to be released was johnc99.map, which requires a minor change because the MAP file format has changed since. It is a simple example room.

With the QuakeEd sources, jrbase1.map has been released, a complete world that has been shown at E3 and is supposedly part of the shareware. The following MAP files are named in the quake.qpr Project file released with the QuakeEd sources:

The QuakeEd sources mention amlev1.map in an obsolete DEBUG statement. Released with qtest1 are the test1/test2/test3 levels, but only the test1 MAP file is available.

Author: B., email to bernd@nero.uni-bonn.de, with public PGP key
Copyright (©) 1995, 1996 by author. All rights reserved.
Source: $Id: QPrimer.html,v 1.3 1996/08/15 08:29:00 b1 Exp $