
The automatic recognition of ceramics from only one photo: The ArchAIDE app

Francesca Anichini (co-first author; Project Administration); Gabriele Gattiglia (co-first author; Writing – Review & Editing)
2021-01-01

Abstract

Pottery is of fundamental importance for understanding archaeological contexts. However, the recognition of ceramics is still a manual, time-consuming activity that relies on analogue catalogues created by specialists and held in archives and libraries. The ArchAIDE project worked to streamline, optimise, and economise the mundane aspects of these processes, using the latest automatic image-recognition technology while retaining the key decision points necessary to create trusted results. The project developed two complementary machine-learning tools that propose identifications based on images captured on site. One method relies on the shape of the fracture outline of a sherd; the other is based on decorative features. For the outline-identification tool, a novel deep-learning architecture was employed, integrating shape information from points along the inner and outer surfaces. The decoration classifier is based on relatively standard architectures used in image recognition. In both cases, training the classifiers required tackling challenges that arise when working with real-world archaeological data: the paucity of labelled data; extreme imbalance between instances of the different categories; and the need to avoid neglecting rare types while taking note of the minute distinguishing features of some forms. The scarcity of training data was overcome by using synthetically produced virtual potsherds and by employing multiple data-augmentation techniques. A novel training loss allowed us to overcome the problems caused by under-populated classes and the non-homogeneous distribution of discriminative features.
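The class-imbalance problem described above (common ceramic types vastly outnumbering rare ones) is often handled by weighting the training loss by inverse class frequency. The following is a minimal generic sketch of that idea, not the paper's actual loss; the function names, the example counts, and the simple negative-log-likelihood formulation are all illustrative assumptions:

```python
import math

def class_weights(counts):
    """Weights inversely proportional to class frequency,
    normalised so they average to 1.0 across classes."""
    inv = [1.0 / c for c in counts]
    scale = len(counts) / sum(inv)
    return [scale * x for x in inv]

def weighted_nll(prob_true_class, label, weights):
    """Negative log-likelihood for one training example,
    scaled by the weight of its true class."""
    return -weights[label] * math.log(max(prob_true_class, 1e-12))

# Hypothetical counts: 900 sherds of a common type, 10 of a rare one.
# The rare type receives a much larger weight, so a classifier that
# ignored it would be penalised heavily during training.
counts = [900, 90, 10]
w = class_weights(counts)
```

With these weights, a misclassified rare-type sherd contributes far more to the loss than a misclassified common-type sherd, discouraging the model from simply predicting the majority class.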
Anichini, Francesca; Dershowitz, Nachum; Dubbini, Nevio; Gattiglia, Gabriele; Itkin, Barak; Wolf, Lior (2021)
Files in this record:
File: 1-s2.0-S2352409X20305794-main.pdf (not available)
Description: Long Paper
Type: Final published version
Licence: NOT PUBLIC - private/restricted access
Size: 6.39 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1076558
Citations
  • Scopus: 14
  • Web of Science: 11