Analysis of Bone Counts by Maximum Likelihood: A Package of Computer Programs

Alan R. Rogers1


Contents


1. Introduction

ABCml stands for Analysis of Bone Counts by Maximum Likelihood. The term refers to three different things:

  1. A statistical method that was introduced by Rogers (2000a) and has been used in several subsequent publications (Rogers, 2000b,c; Rogers and Broughton, 2000).
  2. A computer program that implements this method.
  3. A package of computer programs that includes ABCml-the-program along with several other programs that play supporting roles.
This document describes ABCml-the-package, and includes a description of ABCml-the-program. It also includes a brief description of ABCml-the-statistical method, but you will have to look elsewhere for a full description of the method (Rogers, 2000a).

This introduction describes the goals and limitations of ABCml in general terms. Later chapters will fill in the details.

1.1 Goals

Animals are introduced into archaeological assemblages in various ways, each of which will be referred to as an ``agent of deposition.'' These might include natural deaths at the site, kills at the site by humans or non-human carnivores, transport to the site by hunters or scavengers of various species, transport by running water, and so on. The primary goal of ABCml is to estimate the relative contributions of different agents of deposition.

In order to achieve this goal, it is necessary also to estimate two additional parameters, which account for factors that also influence bone counts. The first of these factors is the number of carcasses originally deposited, and the second is the severity of destructive processes such as gnawing by carnivores.

ABCml provides both point estimates of these parameters and also standard errors.

1.2 Data requirements

ABCml requires, as input data, detailed information about the agents of deposition that are to be studied. The literature contains some data on bone transport by human and non-human hunters, but our knowledge of this issue is nonetheless woefully inadequate. In the absence of extensive data on bone transport, how can ABCml be used?

First, it is important to realize that this problem affects not only ABCml, but also all other methods of analysis. ABCml differs from other methods only in that it requires that we make our assumptions explicit and detailed.

Second, there is no special difficulty in incorporating conventional ideas about transport and attrition into ABCml. For example, there is a long history of analyses based on the idea that hunters will first discard those skeletal parts with greatest weight and lowest utility (Perkins and Daly, 1968; White, 1954). Rogers and Broughton (2000) show how this qualitative idea can be incorporated into an analysis using ABCml. The results, of course, are no better than the assumptions, but that is true of any method. ABCml makes better use of its input data and violates fewer assumptions than any alternative method of analysis that is currently available. (For a critique of alternative methods, see Rogers (2000a), Rogers (2000b), Rogers (2000c), and Rogers and Broughton (2000).)

Third, there is a considerable body of empirical research regarding carcass transport by humans and other predators (Bunn et al., 1988; Marean et al., 1992; Binford, 1978; O'Connell et al., 1988). It is easy to incorporate the resulting data into analyses using ABCml.

In summary, ABCml makes optimal use of available data from all sources while forcing us to be explicit about our assumptions.

1.3 The programs

The ABCml package consists of the following programs:

ABCml
estimates parameters and their standard errors, and provides an analysis of residuals. If the input file contains a large number of data sets, ABCml will also provide quantiles of each estimated parameter.

ABCsim
generates bone-count data by computer simulation.

gnaw
subjects a data set to density-biased attrition.

tabcfg
tabulates the configurations in a .cfg file, eliminating duplicates.

mau2cfg
converts data from .mau format (which is common in published literature) into .cfg format (which is required by several of the programs in this package).

cplcfg
finds the complement of a .cfg file. If a .cfg file describes bones that were transported from a kill site to a home base, then its complement will describe the bones that were left in the field.


2. Input Files

This chapter describes the various kinds of input file that are used by the programs in this package. The type of a file is specified by its suffix:


  Suffix           File Type
  ------           ---------
   .bdf            Bone definition file
   .cfg            Agent configuration file
   .mau            Minimum Animal Units file
   .cnt            Skeletal part counts
   .wgt            Skeletal part weights
All of these formats are plain ASCII. Comments may be included in any input file. A comment begins with the character # and ends at the end of the line. Comments are ignored by the input routines. None of the programs in the package uses all of these types of input file, but several of them use more than one.


2.1 Bone Definition File (.bdf)

The bone definition file is used to describe the characteristics of skeletal parts. Example:


    #File toy.bdf
    2   #number of parts
    #label      live    density
    Skull       1       0.49
    Femur       2       0.37

The file toy.bdf defines the skeletal parts of a toy model in which there are only two elements: skull and femur. The first line of input says that there are 2 parts. The next line is a comment that is ignored by the input routine. After that, each line has three fields, which are separated by white space (blanks and/or tabs):

Field 1
the label of the skeletal part

Field 2
the number of this part in a complete skeleton

Field 3
the density of this part. The program assumes that a copy of skeletal part i survives attrition with probability exp(-βA/di), where di is the density of the skeletal part, A is a scaling constant, and β measures the intensity of attrition. A is determined by the program so that when β = 1, half the bones in an entire skeleton would survive attrition. Because of this rescaling, it doesn't matter what units density is measured in.


2.2 Agent Configuration File (.cfg)

The agent configuration file describes the characteristics of agents of deposition. Example:


    # file:  home.cfg
    2  #number of parts
    5  #number of configurations
    #probabilities of configurations:
             0.5   0.2   0.15    0.1    0.05
    #Configurations:
    #label              
    Skull    0     0     1       1      1
    Femur    1     2     2       1      0

An animal need not be deposited in the assemblage as an entire skeleton. It may be deposited as a skull only, as a skull and 1 femur, as 2 femurs and no skull, and so on. Each of these possibilities is called a ``configuration''. To describe an agent of deposition, the .cfg file specifies the configurations that are possible along with their probabilities. The lines are interpreted as follows:

Line 1
the number of skeletal parts

Line 2
the number of configurations

Line 3
numbers, separated by white space, giving the probabilities of the configurations. Although I refer to these as probabilities, they need not sum to 1, for they are normalized by the input routine. For example, the probabilities line for a data set with 5 configurations might look like

    1 1 2 1 1
which would imply that the probability of configuration 3 is twice as large as those of configurations 1, 2, 4, and 5.

Each succeeding line
has several fields, separated by whitespace. The first field is the label for a particular skeletal part. The remaining fields give the number of that part in configuration 1, configuration 2, and so on.

For example, home.cfg tells us that there are 2 parts and 5 configurations. The 1st configuration occurs with probability 0.5 and consists of 0 skulls and 1 femur.
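As an illustration of these rules, a minimal .cfg reader might look like this in Python (a sketch, not the package's actual input routine; note that comments are stripped and the probabilities are normalized on input):

```python
import io

def read_cfg(stream):
    """Minimal .cfg reader: strip # comments, then read whitespace-
    separated tokens in the order described above."""
    def tokens():
        for line in stream:
            yield from line.split('#', 1)[0].split()  # drop comments
    t = tokens()
    nparts = int(next(t))
    nconfigs = int(next(t))
    raw = [float(next(t)) for _ in range(nconfigs)]
    total = sum(raw)
    probs = [p / total for p in raw]      # normalized by the input routine
    configs = {}
    for _ in range(nparts):
        label = next(t)
        configs[label] = [int(next(t)) for _ in range(nconfigs)]
    return probs, configs

home = """2  #number of parts
5  #number of configurations
0.5 0.2 0.15 0.1 0.05
Skull 0 0 1 1 1
Femur 1 2 2 1 0"""
probs, configs = read_cfg(io.StringIO(home))
```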


2.3 Minimum Animal Units File (.mau)

In published literature, one often finds skeletal part counts expressed as Minimum Animal Units (MAUs). These are equal to the number of distinct copies of each part divided by the number of copies of that part in a living animal. The .mau file format provides the same information as the .cfg file format except that skeletal parts are expressed as MAUs rather than as raw counts. The format is identical to the .cfg format except that configurations are columns of floating-point numbers rather than columns of integers. Use the program mau2cfg to convert from .mau format to .cfg format.
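The conversion itself is simple arithmetic. Since MAU_i = count_i / live_i, each column is multiplied back out (a sketch with a hypothetical helper name; I assume here that mau2cfg rounds to the nearest whole bone, which the program may or may not do):

```python
def mau_to_counts(mau_column, live_counts):
    """Convert one configuration column from MAUs back to raw counts
    by multiplying each MAU by the number of that part in a live animal."""
    return [round(m * n) for m, n in zip(mau_column, live_counts)]

# toy example: against a skeleton with 1 skull and 2 femurs, a
# configuration of 1 skull and 1 femur is expressed in MAUs as [1.0, 0.5].
counts = mau_to_counts([1.0, 0.5], [1, 2])
```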


2.4 Skeletal Part Count File (.cnt)

The skeletal part count file is read by several of the programs in the package. In addition, abcsim produces output in the form of a .cnt file, so that it can be used as input by other programs. This file provides the counts of skeletal parts from one or more assemblages. Here is an example:


    # file: toy.cnt
    2 #number of parts
    2 #number of data sets
    #label              DS0  DS1
    Skull               537  549
    Femur               942  983
The first two lines give the number of skeletal parts and the number of data sets. Each succeeding line has several fields separated by whitespace, which are interpreted as follows:

Field 1
skeletal part label

Field 2
count of this part in the 1st data set

Field n
count of this part in the (n-1)th data set


2.5 Weight File (.wgt)

The weight file format is read only by the wgtcfg program. It allows wgtcfg to generate a .cfg file that is consistent with the assumption that configurations that are high in energetic value but easy to carry are most likely to be transported. Its format looks like this:


  24 # number of parts
  #part               MUI    GrossWgt
  half-mandible       295.00   889.00 # without tongue; divisor=2
  atlas/axis          262.00   315.00 # divisor = 2
  cervical_vert       272.14   301.71 # divisor = 7
  thoracic_vert       187.15   214.62 # divisor = 13
  lumbar_vert         284.33   323.33 # divisor = 6
  innominate          843.67  1058.88 # = pelvis/3
  sacrum              843.67  1058.88 # = pelvis/3
  rib                 101.92   141.81 # divisor = 26
  scapula            2295.00  2398.00 # divisor = 1
  P_humerus           743.00   830.50 # divisor = 2
  D_humerus           743.00   830.50 # divisor = 2
  P_radius            377.50   459.50 # divisor = 2
  D_radius            377.50   459.50 # divisor = 2
  carpal               33.50    46.75 # divisor = 8
  P_metacarpal         33.50    46.75 # divisor = 8
  D_metacarpal         33.50    46.75 # divisor = 8
  phalange             15.67    24.67 # divisor = 6
  P_femur            2569.50  2671.00 # divisor = 2
  D_femur            2569.50  2671.00 # divisor = 2
  P_tibia             218.33   255.33 # divisor = 6
  D_tibia             218.33   255.33 # divisor = 6
  tarsal              218.33   255.33 # divisor = 6
  P_metatarsal        290.50   377.00 # divisor = 2
  D_metatarsal        290.50   377.00 # divisor = 2

As usual, the first column gives the labels of the skeletal parts. The second column gives some measure of the utility of the part. In this example I have used the ``meat utility index'' (MUI) of Metcalfe and Jones (1988, p. 489). It is simply the weight (in grams, as I recall) of the meat attached to the bone. The third column gives the gross weight (again, I think, in grams) of the skeletal part.

There is one minor complication. Published data provide weights not for individual skeletal parts but for various packages that are thought to be transported together (Metcalfe and Jones 1988, p 489; Binford 1978, pp 16-17). For example, the values published for ``ribs'' refer to the entire rib cage, but those for ``femur'' refer to a single femur (not the two femurs together). Both the MUI and the GrossWgt values must be divided into components that reflect the skeletal parts used in the analysis. If the ribs always travel together, then it doesn't matter whether we assign the entire weight to a single rib and assign zero weight to the other ribs, or whether we assign equal weight to each rib. The latter choice is simpler, however, so that is what I have done here. Since there are 26 ribs, I divided the published ``rib'' values by 26. Since the distal and proximal ends of the femur are tabulated separately, I divided the published ``femur'' values by 2. These divisors are given in comments at the end of each input line to help me remember what I did.


3. ABCml

3.1 General

Abcml is a program for Analysis of Bone Counts by Maximum Likelihood. Two versions are available, one written in C and the other written in Java. The C version is much faster. The Java version is slower, but has two advantages. In the first place, there is no need to install the software. It can be executed by pointing a web browser at http://mombasa.anthro.utah.edu/alan/abcml. In the second place, the Java version has a graphical user interface as well as a command-line interface. Although many people prefer a graphical user interface, the command-line interface of ABCml is in fact much faster and easier to use.

3.2 Input data

Whichever interface you use, you will need to provide several kinds of input data. Each data set that you provide can either be read from a file or (under the graphical user interface) typed or pasted into a dialog box. If you are running the Java version from a web browser, the web browser will probably not allow file input. In that case, the program will prompt you for data, which can either be typed or pasted into the dialog boxes that are provided. The data sets required by ABCml are as follows:

  1. Bone definition data, which describe the characteristics of the skeletal parts to be analyzed. If these data are read from a file, the file's name must end with .bdf.

  2. One or more sets of agent definition data, each of which describes an agent of deposition. If these data sets are read from files, each file name must end with .cfg.

  3. Skeletal part count data, which contain the skeletal part counts on which the estimates are based. If these data are read from a file, the file name must end with .cnt.

3.3 Command-line options

In addition, the command-line interface recognizes the following command line options:

-a
Toggle estimation of attrition (β). By default β is estimated; giving -a turns estimation off, and the program then assumes that attrition is absent.

-C
Print matrix of sampling covariances. Default: No. (C version only.)

-D x
Set sensitivity to x/density. This option makes it possible to perform several analyses with the same set of sensitivity values. Default: Sensitivities are automatically scaled as described below. (C version only.)

-e x
Set size of initial simplex. It may be useful to change this when the program fails to converge, or when different runs converge to different answers. Default: 0.1. (C version only.)

-L
Print lnL? Default: No. (C version only; the Java version always prints lnL.)

-p
Toggle that controls whether the program uses principal components analysis to reduce the dimension of the problem. When PC is not used, dimensions are reduced by successively lumping skeletal parts with high correlation. The default is to use principal components. (C version only.)

-r
Before dimension reduction, the program must calculate a pooled covariance matrix by averaging the matrices of the various agents of deposition. By default, an unweighted average is used. This flag instructs the program to use a randomly-weighted average. (C version only.)

-w i
Use only column i of data in .cnt file.

-v
Toggle verbose mode. Default: Off. (C version only.)
The program uses the method of maximum likelihood to estimate the following parameters:
κ
Think of κ as the number of animals originally contributed to the assemblage. Strictly speaking, K is the number originally contributed and κ is the expected value of K. In practice, an estimate of κ is an estimate of K with a conservative confidence interval.

β
The intensity of attrition. Skeletal part i survives attrition with probability exp(-βsi), where si measures the sensitivity of part i to attrition. The sensitivity measure is si = A/di, where di is the density of part i (as given in the .bdf file) and A is a constant of proportionality. By default, A is chosen so that when β = 1, half the bones in a complete skeleton will survive. Alternatively, A can be set using the -D option (see above).

α
A vector whose i'th entry, αi, is the fraction of the assemblage representing contributions by the i'th agent of deposition. Since the entries of α must sum to unity, there are j α parameters to estimate when the number of agents is j + 1. If only one agent is specified, no α parameters are estimated.

When the -a flag is given, β is not estimated.

3.4 Examples

Given data files such as those in the toy directory of this distribution, κ and α0 can be estimated with the C version of the software by typing:


  abcml toy.bdf home.cfg kill.cfg toy.cnt -a
Using the command-line interface of the Java version, you would type

  java Abcml toy.bdf home.cfg kill.cfg toy.cnt -a
This produces:

  #Cmd line: abcml toy.bdf home.cfg kill.cfg toy.cnt -a
  #Assuming that attrition is absent.
  #Output is                  : not verbose
  #Number of agents           : 2
  #Number of parameters       : 2
  #Number of skeletal parts   : 2
  #Sensitivity to attrition   : 0.28084047551266139164 / density
  #Initial parameter vector is: fixed
  #F mean gives equal weight to each agent
  #
  ### Dataset 1
  #Initial params: kappa=1074 alpha[0]=0.5
  #Using 2 / 2 dimensions.
    rowlbl  mni        kappa     alpha[0]     alpha[1]        ChiSq
  Estimate  537   997.356763     0.405506     0.594494     0.001209
    StdErr  ***    35.503345     0.028033          ***          ***
  
  # Residuals:
  #label                 y           mu         y-mu            Z
  #Skull               537   536.376116     0.623884     0.026938
  #Femur               942   940.810160     1.189840     0.033215
  #
  ### Dataset 2
  #Initial params: kappa=1098 alpha[0]=0.5
  #Using 2 / 2 dimensions.
  # rowlbl  mni        kappa     alpha[0]     alpha[1]        ChiSq
  Estimate  549  1031.067762     0.420381     0.579619     0.001166
    StdErr  ***    36.098974     0.027674          ***          ***
  
  # Residuals:
  #label                 y           mu         y-mu            Z
  #Skull               549   548.370883     0.629117     0.026865
  #Femur               983   981.812260     1.187740     0.032386
The first few lines of output describe the options in effect, and I will discuss these in a moment. The remaining lines give estimates for each of the two data sets in file toy.cnt. The columns of output may include any or all of the following:
mni
Minimum number of individuals, a crude (and badly biased) measure of the number of animals contributing to an assemblage.

kappa
maximum likelihood estimator (MLE) of κ (defined above)

alpha[i]
MLE of αi, the fraction of the assemblage contributed by the i'th agent (defined above)

beta
MLE of β, the intensity of attrition (defined above).

ChiSq
Measures how well the model fits the data. When the model fits well, ChiSq is small. It is calculated as

ChiSq = (y - μ)' C⁻¹ (y - μ)

where y is the vector of bone counts, μ is its expectation under the model, and C is the covariance matrix. In practice, it is usually impossible to invert the full matrix C, so the dimension of the problem is reduced by principal components analysis before this calculation is done. The resulting statistic is approximately Chi-squared, with degrees of freedom equal to the number of dimensions in the reduced version of C. The number of dimensions is printed just before the estimates. Since the ChiSq statistic is only approximately Chi-squared, it is best to test hypotheses using a sampling distribution inferred from computer simulations. For this purpose, use abcsim to generate simulated data sets, and then analyze the simulated data using abcml.

lnL
The natural log of likelihood. This is not a good measure of fit to the model because the analysis is carried out using principal components rather than the raw counts of skeletal parts. Each analysis generates its own principal components, so the lnL values from different analyses may not be comparable. The ChiSq statistic is a more useful measure of goodness of fit.
These columns do not always appear. Their appearance is controlled by command line arguments or by the graphical interface.
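For illustration, the reduced quadratic form might be computed as follows (a sketch under the assumption that the reduction keeps the components with the largest eigenvalues; abcml's actual procedure may differ in detail):

```python
import numpy as np

def chisq(y, mu, C, ndim):
    """ChiSq = (y - mu)' C^-1 (y - mu), evaluated in the subspace spanned
    by the top ndim principal components of the covariance matrix C."""
    evals, evecs = np.linalg.eigh(C)            # C is symmetric
    keep = np.argsort(evals)[::-1][:ndim]       # largest eigenvalues first
    z = evecs[:, keep].T @ (np.asarray(y) - np.asarray(mu))  # project residuals
    return float(np.sum(z ** 2 / evals[keep]))  # quadratic form, reduced
```

When ndim equals the full dimension of C, this reproduces the ordinary quadratic form.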

In the example above, the estimates of κ are close to 1000, as they should be, since 1000 animals contributed to each of the simulated data sets in file toy.cnt. The values listed under alpha[0] and alpha[1] are estimates of parameters α0 and α1, which measure the contributions of the two agents of deposition that were specified when the program was run: home.cfg and kill.cfg. In the simulations that generated these data, α0 = 0.4 and α1 = 0.6.

In some cases, the agent of deposition will not be in doubt, and we are interested only in the number of animals represented and in the level of attrition. In such cases, list only a single .cfg file.

3.5 How to make the method fail

  1. The method will fail when the number of parameters exceeds the number of skeletal parts. It may appear that this is the case in the example above, since the output lists four values but there are only 2 skeletal parts. But alpha[0] and alpha[1] count as a single parameter, since alpha[1] equals 1 - alpha[0], and ChiSq is not a parameter at all but a measure of goodness of fit. Thus, we really have just 2 parameters here. Had I omitted the -a flag, however, the program would have tried to estimate β as well. When this happens, the program notices and generates an error message.

  2. The method will fail if there are several agents of deposition, one of which has only a single configuration.

3.6 Notes on the implementation

Parameters are estimated using the method of maximum likelihood. The details of the method can be found in Rogers (2000a). To find parameter values that maximize the likelihood, the current computer program uses the downhill simplex method of Nelder and Mead, as implemented by Press et al. (1992). This algorithm is among the least efficient available, but it is simple and easy to implement. If this program turns out to have wide interest, it will be worth implementing a more efficient maximization routine.


3.7 ABCSIM


ABCSIM Documentation File

Abcsim is a program that generates one or more simulated archaeological data sets using data from input files and from the command line. It requires the following input files:

  1. A bone definition file, whose name ends with .bdf. This file describes the characteristics of the skeletal parts to be analyzed.

  2. Two or more agent definition files, each having a name ending with .cfg. Each of these files describes an agent of deposition.

The format of these files is described in Chapter 2.

In addition, the program recognizes the following command line options:

-a x,x,...,x

Sets the vector of alpha values to the comma-separated string of numbers given as an argument. alpha[i] is the fraction of the assemblage attributable to the i'th agent of deposition. By default, all the entries of alpha have the same value.

-b x

Set beta, the attrition parameter. This determines how strongly the simulated assemblages will be affected by attrition. When beta=0, no bones are lost to attrition. When beta=1, half the bones in a complete skeleton would be lost. By default, beta=0 so that no bones are lost.

-K x

Set number of animals in assemblage. Default: 100

-n x

Set number of simulated datasets. Default: 1

For example, to generate two data sets using the input files in the toy directory, and with beta=.3, type:


  abcsim toy.bdf home.cfg kill.cfg -n 2 -b 0.3

This generates two random data sets, which are written to output in .cnt format. Here is a sample set of output:


    2 #number of parts
    2 #number of data sets
  #label              DS0  DS1
  Skull                37   49
  Femur                88   84

The simulations are done by generating alpha[i]*K animals from agent i. Each animal is generated by choosing a configuration with the probabilities given in the relevant .cfg file, and then adding the relevant skeletal parts to the data set. Then, each skeletal part is exposed to attrition. A skeletal part of type j survives attrition with independent probability


     exp(-beta*s[j])

where s[j] is the sensitivity of the jth skeletal part, which is proportional to the reciprocal of the density of that part, as given in the .bdf file. The constant of proportionality is adjusted so that, on average, half the bones in a complete skeleton would survive when beta=1.
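In outline, the procedure above can be sketched as follows (a hypothetical illustration, not abcsim's source; in particular, I assume here that alpha[i]*K is rounded to the nearest integer):

```python
import math
import random

def simulate(K, alphas, agents, beta, sensitivity):
    """agents: list of (probs, configs), where configs maps each part
    label to its per-configuration counts; sensitivity maps label -> s[j].
    Returns simulated counts of each skeletal part."""
    counts = {lbl: 0 for lbl in sensitivity}
    for a, (probs, configs) in zip(alphas, agents):
        for _ in range(round(a * K)):               # animals from this agent
            j = random.choices(range(len(probs)), weights=probs)[0]
            for lbl, col in configs.items():        # deposit the parts of
                counts[lbl] += col[j]               # the chosen configuration
    for lbl in counts:                              # attrition: each bone
        p = math.exp(-beta * sensitivity[lbl])      # survives independently
        counts[lbl] = sum(random.random() < p for _ in range(counts[lbl]))
    return counts
```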


3.8 CPLCFG


CPLCFG Documentation File

Cplcfg finds the complement of a .cfg file with respect to the number of bones in a live animal. If home.cfg describes the bones that are transported home, then cplcfg can be used to calculate a description of the bones that were not transported.


  Usage: cplcfg filename.bdf filename.cfg

where filename.bdf is a bone definition file (in .bdf format) and filename.cfg is an agent configuration file (in .cfg format). The output is another .cfg file. Given this .bdf file:


  #################### toy.bdf #####################################
  2   #number of parts
  #label          live    density
  Skull           1       0.49
  Femur           2       0.37

and this .cfg file:


  #################### home.cfg ##################################
  2  #number of parts
  5   #number of configurations
  #
  #probabilities of configurations:
                0.5   0.2   0.15    0.1    0.05
  #
  #Configurations:
  #label 
  Skull         0     0     1       1      1
  Femur         1     2     2       1      0

the following command:


  cplcfg toy.bdf home.cfg

will produce the following output:


       2  # number of parts
       5  # number of configurations
  #Probabilities of configurations are proportional to:
                     10 4 3 2 1
  #label
  Skull              1  1  0  0  0
  Femur              1  0  0  1  2

which is also in the form of a .cfg file. Note that the configurations in the new .cfg file and those of the old one sum, part by part, to the number of bones in a live animal, as given by toy.bdf. Thus, if home.cfg describes the number of bones brought home from a kill site, then cplcfg tells us the number that were left at the kill site.
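The operation itself is one subtraction per entry, as the following sketch shows (hypothetical helper, not cplcfg's source); it reproduces the Skull and Femur columns of the output above:

```python
def complement(configs, live):
    """For each part, the complement configuration holds the bones of a
    live animal that the original configuration lacks."""
    return {lbl: [live[lbl] - c for c in col] for lbl, col in configs.items()}

home = {"Skull": [0, 0, 1, 1, 1], "Femur": [1, 2, 2, 1, 0]}
live = {"Skull": 1, "Femur": 2}   # from toy.bdf
cpl = complement(home, live)
```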


3.9 MAU2CFG


MAU2CFG Documentation File

Agents of deposition can be described in either of two formats. The .mau file format divides the number of each skeletal part by the number of that part in a living animal, whereas the .cfg format uses raw counts. Many published data are in .mau format, but the other programs in this package require .cfg format. Thus, it is often necessary to translate from .mau to .cfg, and this program does that job. mau2cfg must also read a file in .bdf format, which describes the skeletal parts of living animals. See Chapter 2 for descriptions of these formats.


  Usage: mau2cfg filename.bdf filename.mau

where filename.bdf is the bone definition file and filename.mau the .mau file. These files must have extensions .bdf and .mau, respectively, but the filenames are otherwise arbitrary.

The program's output is in the form of a .cfg file.


3.10 TABCFG


TABCFG Documentation File

Tabcfg is a program that tabulates configurations. It reads a .cfg file, and its output is also in the form of a .cfg file (for a description of .cfg files, see Chapter 2). It examines the configurations in its input and eliminates duplicates. For example, consider the following file, named small.cfg:


  ############## small.cfg #######################################
  2   #number of parts
  5   #number of configurations
  #
  #probabilities of configurations:
                10  4  3   2   1
  #
  #Configurations:
  #label                
  Skull         0   0  1   1   1
  Femur         1   1  2   2   0

The 1st and 2nd configurations are identical, as are the 3rd and 4th. Running this through tabcfg produces the following output, which is also in the form of a .cfg file:


  #                                   TABCFG
  #                          Tabulate Configurations
  #                             by Alan R. Rogers
  #                                Version 0.10
  #                         Type `tabcfg -- ' for help
  
  #Cmd line: tabcfg small.cfg
  #Configured agent from file small.cfg
  #Input file: small.cfg
       2  # number of parts
       3  # number of configurations
  #Probabilities of configurations are proportional to:
                     14 5 1
  #label
  Skull              0  1  1
  Femur              1  2  0
  
The new file contains the same configurations as the old one, but
now each configuration is unique.  The probability of the i'th
configuration in the new file is the sum of the probabilities
corresponding to that configuration in the old file.   The new .cfg
file is smaller and easier to manipulate than the old one.  Using the
new, shorter, .cfg file as input to abcml or abcsim will generate
precisely the same output, but with some savings in computer time.
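The tabulation amounts to summing weights over identical columns. A sketch (hypothetical helper, not tabcfg's source; it relies on Python dicts preserving insertion order) that reproduces the 14 5 1 weights above:

```python
def tabulate(probs, configs):
    """Merge duplicate configurations, summing their probabilities.
    configs maps each part label to its per-configuration counts."""
    labels = list(configs)
    merged = {}                            # config tuple -> summed weight
    for j, p in enumerate(probs):
        key = tuple(configs[lbl][j] for lbl in labels)
        merged[key] = merged.get(key, 0) + p
    new_cols = {lbl: [key[i] for key in merged]
                for i, lbl in enumerate(labels)}
    return list(merged.values()), new_cols

# small.cfg from above, with weights 10 4 3 2 1
probs, cols = tabulate([10, 4, 3, 2, 1],
                       {"Skull": [0, 0, 1, 1, 1],
                        "Femur": [1, 1, 2, 2, 0]})
```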

Bibliography

Binford, L. R. (1978).
Nunamiut Ethnoarchaeology.
New York: Academic Press.

Bunn, H. T., Bartram, L. E. and Kroll, E. M. (1988).
Variability in bone assemblage formation from Hadza hunting, scavenging, and carcass processing.
Journal of Anthropological Archaeology 7, 412-457.

Marean, C. W., Spencer, L. M., Blumenschine, R. J. and Capaldo, S. D. (1992).
Captive hyaena bone choice and destruction, the schlepp effect and Olduvai archaeofaunas.
Journal of Archaeological Science 19, 101-121.

O'Connell, J. F., Hawkes, K. and Blurton Jones, N. (1988).
Hadza hunting, butchering, and bone transport and their archaeological implications.
Journal of Anthropological Research 44, 113-161.

Perkins, D. and Daly, P. (1968).
A hunter's village in neolithic Turkey.
Scientific American 219, 97-106.

Press, W. H., Teukolsky, S. A., Vetterling, W. T. and Flannery, B. P. (1992).
Numerical Recipes in C: The Art of Scientific Computing.
2nd edition. New York: Cambridge University Press.

Rogers, A. R. (2000a).
Analysis of bone counts by maximum likelihood.
Journal of Archaeological Science 27, 111-125.

Rogers, A. R. (2000b).
On equifinality in faunal analysis.
American Antiquity, in press.

Rogers, A. R. (2000c).
On the value of soft bones in faunal analysis.
Journal of Archaeological Science, in press.

Rogers, A. R. and Broughton, J. M. (2000).
Selective transport of animal parts by ancient hunters: A new statistical method and an application to the Emeryville Shellmound fauna.
Submitted for publication.

White, T. E. (1954).
Observations on the butchering technique of some aboriginal peoples: Nos. 3, 4, 5, and 6.
American Antiquity 19, 254-264.



Footnotes

... Rogers1
Dept of Anthropology, University of Utah, Salt Lake City 84112. rogers at anthro dot utah dot edu


