ibllib.io.extractors.video_motion
A module for aligning the wheel motion seen in the video with the rotary encoder signal. Currently used by the camera QC to check timestamp alignment.
Classes
- class MotionAlignment(eid=None, one=None, log=<Logger ibllib.io.extractors.video_motion (INFO)>, stream=False, **kwargs)[source]
Bases: object
- roi = {'body': ((402, 481), (31, 103)), 'left': ((800, 1020), (233, 1096)), 'right': ((426, 510), (104, 545))}
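The boxes in roi appear to follow a ((row_start, row_stop), (column_start, column_stop)) pixel convention; that interpretation is an assumption, and the sketch below only illustrates how such a box could be turned into numpy slices and applied to a grayscale frame.

```python
# Assumed convention: each box is ((row_start, row_stop), (column_start, column_stop)).
# Illustration of slicing a frame with the 'left' box; not the documented API.
import numpy as np

roi_left = ((800, 1020), (233, 1096))
frame = np.zeros((1024, 1280), dtype=np.uint8)   # placeholder grayscale video frame
rows, cols = (slice(*r) for r in roi_left)
wheel_patch = frame[rows, cols]                  # 220 x 863 pixel region around the wheel
```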
- static set_roi(video_path)[source]
Manually set the ROIs for a given set of videos. TODO: Improve docstring. TODO: A method for setting ROIs by label.
- load_data(download=False)[source]
Load wheel, trial and camera timestamp data
- Returns:
wheel, trials
- align_motion(period=(-inf, inf), side='left', sd_thresh=10, display=False)[source]
Align video to the wheel using cross-correlation of the video motion signal and the rotary encoder.
- Parameters:
period ((float, float)) – The time period over which to do the alignment.
side ({'left', 'right'}) – Which camera to use for the alignment.
sd_thresh (float) – Used for plotting: marks where the motion energy exceeds this standard-deviation threshold.
display (bool) – When true, displays the aligned wheel motion energy along with the rotary encoder signal.
- Returns:
int – Frame offset, i.e. by how many frames the video was shifted to match the rotary encoder signal. Negative values mean the video was shifted backwards with respect to the wheel timestamps.
float – The peak cross-correlation.
numpy.ndarray – The motion energy used in the cross-correlation, i.e. the frame difference for the period given.
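A minimal usage sketch of this alignment check, based only on the signatures documented above; the session ID is a placeholder and the chosen alignment period is arbitrary.

```python
# Hypothetical usage; '<session-uuid>' must be replaced with a real experiment ID.
from one.api import ONE
from ibllib.io.extractors.video_motion import MotionAlignment

one = ONE()
ma = MotionAlignment('<session-uuid>', one=one)
ma.load_data(download=True)  # wheel, trial and camera timestamp data
offset, xcorr_peak, motion_energy = ma.align_motion(period=(60, 180), side='left')
print(f'video shifted by {offset} frames (peak cross-correlation {xcorr_peak:.2f})')
```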
- class MotionAlignmentFullSession(session_path, label, **kwargs)[source]
Bases: object
- load_data(sync='nidq', location=None)[source]
Loads relevant data from disk to perform motion alignment
- Parameters:
sync – type of sync used, ‘nidq’ or ‘bpod’
location – where the code is being run; if location='SDSC', the dataset UUIDs are removed when loading the data
- Returns:
- get_roi_mask()[source]
Compute the region of interest mask for a given camera. This corresponds to a box in the video that we will use to compute the wheel motion energy.
- Returns:
- find_contaminated_frames(video_frames, thresold=20, normalise=True)[source]
Finds frames in the video that contain artefacts such as the mouse's paw or a human hand. To detect contaminated frames, Otsu thresholding is applied to each frame to separate the artefact from the background image.
- Parameters:
video_frames – np array of video frames (nframes, nwidth, nheight)
thresold – threshold used to differentiate the artefact from the background
normalise – whether to normalise the threshold values for each frame to the baseline
- Returns:
mask of frames that are contaminated
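A minimal sketch of the per-frame Otsu thresholding idea described above, assuming 8-bit grayscale frames; the exact ibllib implementation (e.g. smoothing and baseline normalisation) may differ.

```python
import cv2
import numpy as np

def flag_contaminated(video_frames, thresh=20, normalise=True):
    # video_frames: uint8 array of shape (n_frames, height, width)
    otsu = np.array([cv2.threshold(f, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[0]
                     for f in video_frames])
    if normalise:
        otsu -= otsu.min()               # express thresholds relative to the session baseline
    return np.where(otsu > thresh)[0]    # indices of frames whose threshold stands out
```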
- compute_motion_energy(first, last, wg, iw)[source]
Computes the video motion energy for frame indices between first and last. This function is written to be run in parallel using joblib.Parallel.
- Parameters:
first – first frame index of frame interval to consider
last – last frame index of frame interval to consider
wg – WindowGenerator
iw – iteration of the WindowGenerator
- Returns:
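A hedged sketch of the frame-difference computation commonly referred to as motion energy; the function and variable names are illustrative, not the ibllib API.

```python
import numpy as np

def frame_diff_energy(frames):
    # frames: (n_frames, height, width) array of grayscale ROI frames
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    return diffs.mean(axis=(1, 2))       # one motion-energy value per consecutive frame pair
```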
- compute_shifts(times, me, first, last, iw, wg)[source]
Compute the cross-correlation between the video motion energy and the wheel velocity to find the mismatch between the camera TTLs and the video frames. This function is written to run in parallel using joblib.Parallel.
- Parameters:
times – the times of the video frames across the whole session (TTLs)
me – the video motion energy computed across the whole session
first – first time idx to consider
last – last time idx to consider
wg – WindowGenerator
iw – iteration of the WindowGenerator
- Returns:
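An illustrative sketch of estimating the lag between the motion energy and the wheel speed within one window via cross-correlation; the normalisation and names are assumptions rather than the ibllib implementation.

```python
import numpy as np

def window_lag(motion_energy, wheel_velocity):
    # z-score both signals so the cross-correlation peak is comparable across windows
    me = (motion_energy - motion_energy.mean()) / (motion_energy.std() + 1e-12)
    ws = np.abs(wheel_velocity)                   # motion energy tracks speed, not direction
    ws = (ws - ws.mean()) / (ws.std() + 1e-12)
    xcorr = np.correlate(me, ws, mode='full')
    return np.argmax(xcorr) - (len(ws) - 1)       # lag in samples; 0 means already aligned
```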
- clean_shifts(x, n=1)[source]
Removes artefacts from the computed shifts across time. We assume that the shifts should never increase over time and that the jump between consecutive shifts shouldn’t be greater than 1
- Parameters:
x – computed shifts
n – condition to apply
- Returns:
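One simple way to enforce the two assumptions stated above (shifts never increase over time, consecutive jumps no larger than n); the actual cleaning in ibllib may use a different procedure.

```python
import numpy as np

def clean_shifts_sketch(x, n=1):
    y = np.asarray(x, dtype=float).copy()
    for i in range(1, len(y)):
        y[i] = min(y[i], y[i - 1])       # shifts should never increase over time
        y[i] = max(y[i], y[i - 1] - n)   # and should not jump by more than n between samples
    return y
```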
- qc_shifts(shifts, shifts_filt)[source]
Compute QC values for the wheel alignment. We consider four things:
1. The number of camera TTL values that are missing (when we have fewer TTLs than video frames)
2. The number of shifts that have NaN values, meaning the video motion energy computation did not produce a valid value for those windows
3. The number of large jumps (>10) between the computed shifts
4. The number of jumps (>1) between the shifts after they have been cleaned
- Parameters:
shifts – np.array of shifts over session
shifts_filt – np.array of shifts after being cleaned over session
- Returns:
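An illustrative sketch of the four counts listed above; the argument names are assumptions and the thresholds (10 for the raw shifts, 1 after cleaning) follow the docstring.

```python
import numpy as np

def qc_counts(shifts, shifts_filt, n_ttl, n_frames):
    n_missing_ttl = max(n_frames - n_ttl, 0)                       # fewer TTLs than video frames
    n_nan = int(np.isnan(shifts).sum())                            # windows without a valid shift
    n_large_jumps = int((np.abs(np.diff(shifts)) > 10).sum())      # large jumps in the raw shifts
    n_residual_jumps = int((np.abs(np.diff(shifts_filt)) > 1).sum())
    return n_missing_ttl, n_nan, n_large_jumps, n_residual_jumps
```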
- extract_times(shifts_filt, t_shifts)[source]
Extracts new camera times after applying the computed shifts across the session
- Parameters:
shifts_filt – filtered shifts computed across session
t_shifts – time point of computed shifts
- Returns:
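A hedged sketch of interpolating the filtered shifts onto every frame time and applying them; whether the shift is expressed in frames or in seconds, and its sign, are assumptions here.

```python
import numpy as np

def shifted_camera_times(camera_times, t_shifts, shifts_filt):
    frame_dt = np.median(np.diff(camera_times))                    # nominal frame interval (s)
    shift_per_frame = np.interp(camera_times, t_shifts, shifts_filt)
    return camera_times - shift_per_frame * frame_dt               # assumed sign convention
```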
- static single_cluster_raster(spike_times, events, trial_idx, dividers, colors, labels, weights=None, fr=True, norm=False, axs=None)[source]
Compute and plot trial-aligned spike rasters and PSTHs
- Parameters:
spike_times – times of the variable of interest (e.g. spike times of a single cluster)
events – trial times to align to
trial_idx – trial idx to sort by
dividers
colors
labels
weights
fr
norm
axs
- Returns:
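A generic peri-event raster sketch, not the ibllib implementation: gather the spike times falling in a window around each event and express them relative to that event.

```python
import numpy as np

def peri_event_raster(spike_times, events, pre=0.4, post=1.0):
    spike_times = np.sort(np.asarray(spike_times))
    raster = []
    for t0 in events:
        i0, i1 = np.searchsorted(spike_times, [t0 - pre, t0 + post])
        raster.append(spike_times[i0:i1] - t0)   # spike times relative to the event
    return raster
```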
- plot_with_behavior()[source]
Makes a summary figure of the alignment when behaviour data is available
- Returns:
- plot_without_behavior()[source]
Makes a summary figure of the alignment when behaviour data is not available
- Returns:
- process()[source]
Main function used to apply the video motion wheel alignment to the camera times. This function does the following:
1. Computes the video motion energy across the whole session (computed in windows and parallelised)
2. Computes the shift that should be applied to the camera times across the whole session, by computing the cross-correlation between the video motion energy and the wheel speed (computed in overlapping windows and parallelised)
3. Removes artefacts from the computed shifts
4. Computes the QC for the wheel alignment
5. Extracts the new camera times using the shifts computed from the video wheel alignment
6. If upload is True, creates a summary plot of the alignment and uploads the figure to the relevant session on Alyx
- Returns:
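A hedged end-to-end sketch for the full-session aligner; the session path is a placeholder, and the sync and upload keyword arguments are assumptions inferred from the docstrings above (load_data's sync parameter and the upload behaviour mentioned in process).

```python
from ibllib.io.extractors.video_motion import MotionAlignmentFullSession

session_path = '/path/to/subject/2023-01-01/001'    # placeholder local session path
aligner = MotionAlignmentFullSession(session_path, 'left', sync='nidq', upload=False)
new_camera_times = aligner.process()                # assumed to return the corrected camera times
```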