Until fairly recently, the archaeology of the Western Hemisphere stopped at about 13,000 years ago. Since the discovery of the beautiful and finely worked Clovis points in 1929, and subsequent finds of Clovis technology across the United States, archaeologists generally adopted the “Clovis First” model: the belief that whoever created these tools must have been the first humans to populate North America.

Over the last few decades, however, a series of dramatic discoveries has pushed the estimated arrival of humans in the Western Hemisphere further and further into the past. Dates once considered to be on the fringes of academic archaeology are now being discussed seriously within the mainstream.

The further back our archaeological reach extends, however, the more difficult it becomes to identify human-made artifacts.

“We have a huge number of sites…that only have these other kinds of artifacts,” says archaeologist Bill Andrefsky, “dated to 20,000, 30,000, with Calico, 150,000 years.”

The Calico Early Man Site, located in the Mojave Desert, has generated claims of extreme antiquity, though the current consensus is more cautious.

Andrefsky, an expert in lithic (stone tool) analysis, has developed a series of controlled tests for differentiating human-made artifacts from similar, naturally fractured objects. His recommendations were published in the proceedings of the “Paleoamerican Odyssey” archaeological conference last fall.

His protocol, he says, “comes up with results that say these characteristics are uniquely human, these characteristics are uniquely non-human, or these characteristics are both human and non-human.” More important, his research isolates the environmental contexts and conditions that influence the production of human-made and non-human-made specimens.

There is no question about the human origins of the beautifully crafted Clovis tools and those of later cultures. But much of the analysis of a human-occupied archaeological site depends on the byproducts of tool production, the stone chips and failed attempts that collected around a tool-making place, referred to as “debitage.”

The favored materials of stone tool makers are brittle, hard rocks such as chert (or flint) and obsidian. Such rocks can be cracked to create extremely sharp edges for cutting, scraping, and other uses. However, the properties that make them suitable for tool-making also make them susceptible to natural fracture through erosion, frost-cracking, and animal trampling.

“When we started out in stone tool analysis 30 years ago,” says Andrefsky, “the books all said if it had a bulb of force…or ripple scars on the flake, it’s a human-made artifact.

“Well, if you drop a piece of obsidian down the hill, it’s going to fracture with the … ripple marks and all that.”

Through an extensive series of experiments that included dropping bags of obsidian from a height and trampling chert laid over a variety of underlying soils, and by comparing the results to artifacts excavated from ancient sites, Andrefsky was able to distinguish fracture and wear traits caused by human forces from those caused by non-human forces.

From these experiments, Andrefsky developed a set of guidelines, grounded in context and morphology, for distinguishing between human-made and non-human-made objects.
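To make the idea of a three-way verdict (uniquely human, uniquely non-human, or ambiguous) concrete, the sketch below shows how such context-and-morphology guidelines might be organized in code. It is a hypothetical illustration only: the attribute names, rules, and thresholds are assumptions drawn loosely from traits mentioned in this article (bulb of force, ripple marks, depositional context) and do not represent Andrefsky’s actual published criteria.

```python
# Hypothetical illustration only: a toy rule-based check inspired by the kinds of
# attributes mentioned in the article. The attributes and logic are invented for
# this sketch and are not Andrefsky's published protocol.

from dataclasses import dataclass


@dataclass
class FlakeObservation:
    has_bulb_of_force: bool    # pronounced bulb at the point of impact
    has_ripple_marks: bool     # ripple scars radiating across the flake
    platform_prepared: bool    # evidence of deliberate striking-platform preparation
    steep_slope_context: bool  # recovered from a slope where rocks tumble naturally
    trampling_context: bool    # recovered from a surface subject to animal trampling


def classify_flake(obs: FlakeObservation) -> str:
    """Return 'likely human', 'likely natural', or 'ambiguous' for a single flake."""
    # Traits such as bulbs and ripples occur in both human and natural fracture,
    # so they stay ambiguous unless context or preparation tips the balance.
    if obs.platform_prepared and not (obs.steep_slope_context or obs.trampling_context):
        return "likely human"
    if (obs.steep_slope_context or obs.trampling_context) and not obs.platform_prepared:
        return "likely natural"
    return "ambiguous"


# Example: a flake with a bulb and ripples but no platform preparation,
# found on a trampled surface, would be flagged as likely natural.
example = FlakeObservation(True, True, False, False, True)
print(classify_flake(example))  # -> "likely natural"
```

The point of the sketch is the structure of the decision, not the specific rules: morphological traits alone often cannot settle the question, so the depositional context carries much of the weight.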

Although archaeological analysis is never easy, the steadily expanding scope of investigation in the Western Hemisphere will continue to demand more precise analytical techniques such as Andrefsky’s.

“When I started out in this business 40 years ago, I thought everything was done, with nothing left to discover,” says Andrefsky, half joking.