Appendix A — Prerequisites
A.1 Knowledge
We assume basic familiarity with Python, ideally including its core scientific libraries (NumPy, pandas, Matplotlib) and the Jupyter notebook environment.
A.2 Hardware
This is a hands-on course, so please bring your own laptop and charger.
A mouse is strongly recommended, especially for tasks like image annotation.
A dedicated GPU is not required, though it may speed up some computations.
A.3 Software
You’ll need both general tools for Python programming and specific software required for the course, as detailed below.
A.3.1 General development tools
If you already have a working Anaconda or Miniconda installation and have used it to run Python scripts or Jupyter notebooks, you can likely skip the steps below.
To prepare your computer for Python development, we recommend following the Software Carpentry installation instructions, in particular (a quick check of these installs is sketched after the list):
- Bash Shell, to run terminal commands
- Git, including a GitHub account
- Python, via the conda-forge installer. Please make sure you install a Python version >= 3.12 (e.g. 3.12 is fine, 3.10 is not).
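Once these are installed, you can confirm that the shell, Git, conda, and Python are all available from a terminal. A minimal check, assuming the tools above were installed with their default settings:

```bash
# Each command should print a version string;
# an error means the tool is missing or not on your PATH.
bash --version
git --version
conda --version
python --version   # should report 3.12 or later
```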
You’ll also need a code editor or integrated development environment (IDE) configured for Python.
If you already have one you’re comfortable with, feel free to use it. Otherwise, we recommend either of the following (an optional command-line install is sketched after the list):
- Visual Studio Code with the Python extension
- JupyterLab
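If you go with either recommendation, the sketch below shows one way to set them up from the command line; it assumes conda is available and, for the extension, that the VS Code `code` command is on your PATH. Installing via the graphical installers works just as well.

```bash
# JupyterLab into a conda environment of your choice
conda install -c conda-forge jupyterlab

# Python extension for an existing Visual Studio Code install
code --install-extension ms-python.python
```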
A.3.2 For the SLEAP tutorial
Please install SLEAP following the official installation instructions.
For this workshop, use SLEAP version 1.3.4. Be sure to replace the default version number (e.g. `1.4.1`) in the instructions with `1.3.4`.
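At the time of writing, the conda-based install described in the SLEAP documentation looks roughly like the command below. Treat it as a sketch: follow the official instructions for your platform and only adjust the version number.

```bash
# Sketch of the documented conda install, pinned to the workshop version.
# The exact channels may differ per platform; defer to the SLEAP docs.
conda create -y -n sleap -c conda-forge -c nvidia -c sleap -c anaconda sleap=1.3.4
```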
This should create a conda environment named `sleap` with the necessary dependencies. You can verify the installation by running:
```bash
conda activate sleap
sleap-label
```
This should launch the SLEAP graphical user interface (GUI).
A.3.3 For the interactive notebooks
You will also need a separate conda environment with everything required for the interactive exercises, including the `movement` and `jupyter` packages.
We recommend cloning this workshop’s repository and creating the environment using the provided `environment.yaml` file:
```bash
git clone https://github.com/neuroinformatics-unit/animals-in-motion.git
cd animals-in-motion
conda env create -n animals-in-motion-env -f environment.yaml
```
To test your setup, run:
```bash
conda activate animals-in-motion-env
movement launch
```
This should open the movement GUI, i.e. the napari image viewer with the `movement` plugin docked on the right.
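The same environment should also let you run the interactive notebooks. A minimal sketch, assuming JupyterLab is among the dependencies pulled in by `environment.yaml` and that you start it from the cloned repository:

```bash
# From the root of the cloned repository
conda activate animals-in-motion-env
jupyter lab
```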
There are other ways to install the `movement` package.
However, for this workshop, we recommend using the `environment.yaml` file to ensure that all necessary dependencies, including those beyond `movement`, are included.
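For reference, a standalone install of `movement` into an existing environment typically looks like one of the commands below. This is only a sketch, assuming the package is published on conda-forge and PyPI, and it will not include the workshop-specific extras:

```bash
# Alternatives (not recommended for this workshop)
conda install -c conda-forge movement
# or
pip install movement
```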
A.4 Data
Bringing your own data is encouraged but not required. This could include video recordings of animal behaviour and/or motion tracking data you’ve previously generated.
We also provide some example datasets for you to use during the workshop. Please download these from Dropbox before the workshop starts (they are a few GB in size).