alf.io

Generic ALF I/O module. Provides support for time-series reading and interpolation as per the specifications. For a full overview of the scope of the format, see: https://ibllib.readthedocs.io/en/develop/04_reference.html#alf

Functions

add_uuid_string

check_dimensions

Test for consistency of dimensions as per ALF specs in a dictionary.

dataframe

Converts a Bunch conforming to size conventions into a pandas DataFrame. For 2-D arrays, stops at 10 columns per attribute.

exists

Test if an ALF object and optionally specific attributes exist in the given path.

get_session_path

Returns the session path from any filepath if the date/number pattern is found

is_details_dict

is_session_path

Checks if the syntax corresponds to a session path. Note that there is no physical check of existence or contents.

is_uuid

Bool test for randomly generated hexadecimal UUID validity. Unlike is_uuid_string, this function accepts UUID objects.

is_uuid_string

Bool test for randomly generated hexadecimal UUID validity. NB: the UUID must be hyphen-separated.

load_file_content

Returns content of files.

load_object

Reads all files (i.e. attributes) sharing the same object.

read_ts

Load time-series from ALF format: t, d = alf.read_ts(filename)

remove_uuid_file

Renames a file without the UUID and returns the new pathlib.Path object

remove_uuid_recursive

Within a folder, recursive renaming of all files to remove UUID

save_metadata

Writes a metadata file matching a current ALF file object. For example, given an ALF file clusters.ccfLocation.ssv this will write a dictionary in JSON format in clusters.ccfLocation.metadata.json.

save_object_npy

Saves a dictionary in alf format using object as object name and dictionary keys as attribute names.

Classes

AlfBunch

class AlfBunch(*args, **kwargs)[source]

Bases: brainbox.core.Bunch

property check_dimensions
append(b, inplace=False)[source]

Appends one bunch to another, key by key.

Parameters

b – the Bunch to append

Returns

Bunch

to_df()[source]

Attempts to return a pandas.DataFrame if all elements are arrays of the same length; returns the original bunch if it cannot.

dataframe(adict)[source]

Converts a Bunch conforming to size conventions into a pandas DataFrame. For 2-D arrays, stops at 10 columns per attribute.

Returns

pandas DataFrame

check_dimensions(dico)[source]

Test for consistency of dimensions as per ALF specs in a dictionary. Raises a ValueError.

ALF broadcasting rules: only consistent dimensions are accepted for a given axis. A dimension is consistent with another if it is empty, 1, or equal to the other arrays' dimension: [a, 1], [1, b] and [a, b] are all consistent; [c, 1] is not.

Parameters

dico – dictionary containing data

Returns

status 0 for consistent dimensions, 1 for inconsistent dimensions
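The rule can be sketched in plain Python over shape tuples (a hypothetical `consistent` helper mirroring the 0/1 status convention above; not the library's implementation):

```python
# Sketch of the ALF dimension-consistency rule, written over shape
# tuples; hypothetical helper, not the library implementation.
def consistent(shapes):
    ndim = max(len(s) for s in shapes)
    for axis in range(ndim):
        # on each axis, all dimensions other than 1 (or missing) must agree
        dims = {s[axis] for s in shapes if axis < len(s) and s[axis] != 1}
        if len(dims) > 1:
            return 1  # inconsistent
    return 0  # consistent
```

For example, `consistent([(12, 1), (1, 7), (12, 7)])` is 0 (consistent), while replacing the first shape with `(9, 1)` makes it 1 (inconsistent).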

read_ts(filename)[source]

Load time-series from ALF format: t, d = alf.read_ts(filename)

load_file_content(fil)[source]

Returns content of files. Designed for very generic file formats; so far supported contents are json, npy, csv, tsv, ssv, jsonable.

Parameters

fil – file to read

Returns

array/json/pandas DataFrame depending on the format

exists(alfpath, object, attributes=None, **kwargs)[source]

Test if an ALF object and optionally specific attributes exist in the given path.

Parameters
  • alfpath – str or pathlib.Path of the folder to look into

  • object – str ALF object name

  • attributes – str or list of strings for wanted attributes

Returns

bool. For multiple attributes, returns True only if all attributes are found
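A minimal sketch of such an existence test, assuming plain object.attribute.extension file names (the `alf_exists` helper below is hypothetical, not ibllib's code):

```python
import tempfile
from pathlib import Path

# Hypothetical sketch of the existence test for plain
# object.attribute.extension names; not the ibllib implementation.
def alf_exists(alfpath, obj, attributes=None):
    files = [f.name for f in Path(alfpath).glob(f'{obj}.*')]
    if not files:
        return False
    if attributes is None:
        return True
    # all requested attributes must be present
    found = {f.split('.')[1] for f in files}
    return all(a in found for a in attributes)

# Demo on a throwaway folder
folder = tempfile.mkdtemp()
(Path(folder) / 'spikes.times.npy').touch()
(Path(folder) / 'spikes.clusters.npy').touch()
```

With the demo folder above, `alf_exists(folder, 'spikes', ['times'])` is True while `alf_exists(folder, 'spikes', ['amps'])` is False.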

load_object(alfpath, object=None, short_keys=False, **kwargs)[source]

Reads all files (i.e. attributes) sharing the same object. For example, if the file provided to the function is spikes.times, the function will load spikes.times, spikes.clusters, spikes.depths, spikes.amps in a dictionary whose keys will be times, clusters, depths, amps. Full reference here: https://docs.internationalbrainlab.org/en/latest/04_reference.html#alf Simplified example: _namespace_object.attribute_timescale.part1.part2.extension

Parameters
  • alfpath – any alf file pertaining to the object OR directory containing files

  • object – if a directory is provided and object is None, all valid ALF files returned

  • short_keys – by default, the output dictionary keys will be compounds of attributes, timescale and any eventual parts separated by a dot. Use True to shorten the keys to the attribute and timescale.

Returns

a dictionary of all attributes pertaining to the object

Examples

# Load spikes object
spikes = ibllib.io.alf.load_object('/path/to/my/alffolder/', 'spikes')

# Load trials object under the ibl namespace
trials = ibllib.io.alf.load_object(session_path, 'trials', namespace='ibl')
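The simplified naming pattern can be illustrated with a small regex sketch (the `parse_alf_name` helper below is hypothetical and folds timescale and extra parts into the attribute group; it is not the ibllib parser):

```python
import re

# Hypothetical helper illustrating the simplified naming pattern
# _namespace_object.attribute.extension; NOT the ibllib parser
# (timescale and extra parts are folded into the attribute group).
ALF_NAME = re.compile(
    r'^(?:_(?P<namespace>[^_]+)_)?'   # optional _namespace_
    r'(?P<object>[a-zA-Z0-9]+)'       # object name
    r'\.(?P<attribute>[\w.]+)'        # attribute (and any parts)
    r'\.(?P<extension>\w+)$'          # file extension
)

def parse_alf_name(filename):
    m = ALF_NAME.match(filename)
    return m.groupdict() if m else None
```

For example, `parse_alf_name('_ibl_trials.intervals.npy')` yields namespace 'ibl', object 'trials' and attribute 'intervals'.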

save_object_npy(alfpath, dico, object, parts=None, namespace=None, timescale=None)[source]

Saves a dictionary in ALF format using object as object name and dictionary keys as attribute names. Dimensions have to be consistent. Reference here: https://github.com/cortex-lab/ALF Simplified example: _namespace_object.attribute.part1.part2.extension

Parameters
  • alfpath – path of the folder to save data to

  • dico – dictionary to save to npy; keys correspond to ALF attributes

  • object – name of the object to save

  • parts – extra parts to the ALF name

  • namespace – the optional namespace of the object

  • timescale – the optional timescale of the object

Returns

List of written files

Example: ibllib.io.alf.save_object_npy('/path/to/my/alffolder/', spikes, 'spikes')

save_metadata(file_alf, dico)[source]

Writes a metadata file matching a current ALF file object. For example, given an ALF file clusters.ccfLocation.ssv this will write a dictionary in JSON format in clusters.ccfLocation.metadata.json. Reserved keywords:

  • columns: column names for binary tables.

  • row: row names for binary tables.

  • unit

Parameters
  • file_alf – full path to the alf object

  • dico – dictionary containing meta-data.

Returns

None
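The sibling-file naming convention can be sketched with pathlib (hypothetical helpers, not the ibllib implementation):

```python
import json
from pathlib import Path

# Sketch of the naming convention above: the metadata file sits next to
# the ALF file with a .metadata.json suffix. Hypothetical helpers, not
# the ibllib implementation.
def metadata_path(file_alf):
    return Path(file_alf).with_suffix('.metadata.json')

def write_metadata(file_alf, dico):
    metadata_path(file_alf).write_text(json.dumps(dico, indent=1))
```

`metadata_path('clusters.ccfLocation.ssv')` maps to `clusters.ccfLocation.metadata.json`, as in the example above.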

remove_uuid_file(file_path, dry=False)[source]

Renames a file without the UUID and returns the new pathlib.Path object

remove_uuid_recursive(folder, dry=False)[source]

Within a folder, recursive renaming of all files to remove UUID

add_uuid_string(file_path, uuid)[source]
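The filename convention behind add_uuid_string and the remove_uuid functions can be sketched as follows (hypothetical name-only helpers; the real functions also rename files on disk):

```python
import uuid
from pathlib import Path

# Sketch of the naming convention: the UUID is inserted just before the
# file extension. Hypothetical helpers, not ibllib's code.
def add_uuid(file_path, uid):
    p = Path(file_path)
    return p.with_suffix(f'.{uid}{p.suffix}')

def strip_uuid(file_path):
    p = Path(file_path)
    parts = p.stem.split('.')
    try:
        uuid.UUID(parts[-1])   # drop the trailing part if it is a UUID
        parts = parts[:-1]
    except ValueError:
        pass
    return p.with_name('.'.join(parts) + p.suffix)
```

For example, adding a UUID to spikes.times.npy and stripping it again round-trips back to spikes.times.npy.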
is_uuid_string(string: str) → bool[source]

Bool test for randomly generated hexadecimal UUID validity. NB: the UUID must be hyphen-separated.

is_uuid(uuid: Union[str, int, bytes, uuid.UUID]) → bool[source]

Bool test for randomly generated hexadecimal UUID validity. Unlike is_uuid_string, this function accepts UUID objects.
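Both checks can be approximated with the standard library's uuid module (an illustrative sketch, not the ibllib implementation; int and bytes handling is omitted):

```python
import uuid

# Hypothetical sketches: a valid hyphen-separated UUID string is 36
# characters and round-trips through uuid.UUID in canonical form.
def looks_like_uuid_string(s):
    if not isinstance(s, str) or len(s) != 36:
        return False
    try:
        return str(uuid.UUID(s)) == s.lower()
    except ValueError:
        return False

def looks_like_uuid(value):
    # unlike the string test, UUID instances are accepted directly;
    # int and bytes inputs are not handled in this sketch
    if isinstance(value, uuid.UUID):
        return True
    try:
        uuid.UUID(str(value))
        return True
    except ValueError:
        return False
```

The length check is what enforces hyphen separation: the 32-character un-hyphenated form is rejected even though uuid.UUID would accept it.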

get_session_path(path: Union[str, pathlib.Path]) → pathlib.Path[source]

Returns the session path from any filepath if the date/number pattern is found
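The date/number lookup can be sketched with a regex, assuming the usual .../subject/yyyy-mm-dd/nnn session layout (hypothetical helper, not the ibllib implementation):

```python
import re
from pathlib import Path

# Sketch of the date/number pattern lookup, assuming the usual
# .../subject/yyyy-mm-dd/nnn session layout; hypothetical helper,
# not the ibllib implementation.
SESSION_RE = re.compile(r'\d{4}-\d{2}-\d{2}/\d{3}')

def session_path_of(filepath):
    posix = Path(filepath).as_posix()
    m = SESSION_RE.search(posix)
    # truncate the path right after the session number, if found
    return Path(posix[:m.end()]) if m else None
```

For example, '/data/subj/2019-07-01/001/alf/spikes.times.npy' truncates to the session path '/data/subj/2019-07-01/001'.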

is_session_path(path_object)[source]
Checks if the syntax corresponds to a session path. Note that there is no physical check of existence or contents.

Parameters

path_object

Returns

bool

is_details_dict(dict_obj)[source]