Database

Development updates for the DBSSIN library

by Lee Napthine, 28 February 2025

Over the last year, we’ve been adapting DBSSIN (Database-Spreadsheet-Ingestion) to better fit current projects, and it’s come with its share of development challenges. The ARCsoft team has been pushing the library forward, cleaning up the mapping system, and handling some unique security and access issues.

Performance analysis of database caching versus in-memory caching

by Bhavy Rai, 5 April 2024

Django offers many caching strategies, most notably in-memory caches such as Memcached and Redis, and database-backed caching that interfaces directly with your pre-defined backend. There are, however, several considerations to weigh when choosing a caching solution. Below, I analyze my findings after implementing caching with both Redis and PostgreSQL/SQLite.
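For readers unfamiliar with how these two approaches are wired up, here is a minimal sketch of the relevant `CACHES` setting. The backend paths are the ones Django ships with; the alias `db_cache`, the Redis address, and the table name are illustrative placeholders, not values from the post.

```python
# Sketch of a Django settings.py fragment comparing the two cache backends.
CACHES = {
    # In-memory cache backed by a Redis server.
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379",
    },
    # Database cache stored in a table on the configured database backend.
    # The table must be created first with `python manage.py createcachetable`.
    "db_cache": {
        "BACKEND": "django.core.cache.backends.db.DatabaseCache",
        "LOCATION": "my_cache_table",
    },
}
```

With both aliases defined, the same `cache.get`/`cache.set` API can be timed against either backend, which is the kind of side-by-side comparison this analysis makes.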

Ingestion Made Easy!

by Priya Srinivasan, 11 August 2023

ZooDB already possessed an ingestion script, capable of efficiently processing hundreds of rows of zooarchaeological bone data. This script was traditionally executed from the command line by a developer. To enhance the user experience and streamline the ingestion process, we created a user-facing feature that lets researchers upload and process their data through the web application. In this article, I describe the crucial parts of this feature and how it was developed.

Basics of Openpyxl and Ingestion of data into SQLite database

by Priya Srinivasan, 28 March 2023

In this blog post, we will discuss the basics of the Python library openpyxl, and how to ingest data from an Excel spreadsheet into a database such as SQLite.
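As a rough sketch of the workflow the post covers, the snippet below reads rows with openpyxl and inserts them into SQLite. The column names (`site`, `taxon`, `count`) and function names are my assumptions for illustration, not the post's actual schema.

```python
import sqlite3

def ingest_rows(rows, db_path=":memory:"):
    """Insert (site, taxon, count) tuples into a SQLite table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bones (site TEXT, taxon TEXT, count INTEGER)"
    )
    conn.executemany("INSERT INTO bones VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn

def read_sheet(path):
    """Read data rows from the first worksheet, skipping the header row."""
    from openpyxl import load_workbook  # requires the openpyxl package
    ws = load_workbook(path, read_only=True).active
    # values_only=True yields plain cell values instead of Cell objects.
    return [row[:3] for row in ws.iter_rows(min_row=2, values_only=True)]
```

A typical call would be `ingest_rows(read_sheet("bones.xlsx"), "zoodb.sqlite3")`, assuming a spreadsheet laid out with one header row followed by data.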

Turning a Spreadsheet Into a Postgres Database

by Stephen Neale, 4 November 2022

Hello! Stephen here. From September to October, I worked on a database of excavations and bone counts for the archaeology department here at the University of Victoria, a project dubbed “ZooDB”. In this blog post, I will go over how I approached one of my first tasks: automatically turning a spreadsheet of data into a Postgres database.
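One place such automation can start is by deriving a table definition from the spreadsheet's header row. The sketch below builds the `CREATE TABLE` statement only; the function name, the TEXT-only column typing, and the header normalization are my assumptions, not the post's actual code.

```python
def create_table_sql(table, headers):
    """Build a Postgres CREATE TABLE statement from spreadsheet headers.

    Every column defaults to TEXT; a fuller version would inspect the
    data rows to infer INTEGER, NUMERIC, DATE, and so on.
    """
    cols = ", ".join(
        # Normalize headers like "Site Name" into valid identifiers.
        f'"{h.strip().lower().replace(" ", "_")}" TEXT'
        for h in headers
    )
    return f'CREATE TABLE "{table}" ({cols});'
```

The resulting string would then be executed against Postgres with a driver such as psycopg2, followed by bulk inserts of the data rows.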