Python modules

Rationale

This is an attempt to make analysis of wheelchair biomechanics data more accessible and transparent. Previously, all analyses were performed with commercial software that is not available to everyone, especially people not associated with a university. Having the analysis in Python makes it accessible and (hopefully) more readable for everyone. By sharing the code I hope to be transparent and to reduce the number of times this code has to be rewritten by other people.

Examples & audience

This package is aimed at people working in our lab who want to work with data from any of our instruments. It can, of course, also be used by others, provided that you have similar equipment. Most of the time you will only need one or two functions, which you can take from the source code, or you can simply install the package: it has very little overhead anyway and only depends on packages that you probably already have installed. Also have a look at the examples.

Installation

Option 1: the package is on PyPI, so you can install it with pip:

pip install worklab

Option 2: download the package from this page, and run:

python setup.py install

Option 3: don’t install it and just include the scripts in your working directory (why though?).

To verify that everything works, simply try to import worklab:

import worklab as wl

That’s it.

Breakdown

  • com: Provides functions for reading and writing data; use load to infer the filetype and read it automatically. If you use a different naming scheme you can always call the specific load functions.

  • kinetics: Contains all essentials for measurement wheel and ergometer data. You only need the top-level function auto_process for most use-cases.

  • move: Contains kinematics and movement related functions for NGIMU and some functions for 3D kinematics.

  • physio: Contains physiological calculations, which for now is basically nothing, as the spirometer does everything for you. Might include EMG and the like later though.

  • plots: Contains some basic plotting functionality for plots that become repetitive; needs some TLC to become really useful.

  • utils: Contains all functions that are useful for more than one application (e.g. filtering and interpolation).

In 9 out of 10 cases a function returns a pandas DataFrame, which means you can also use all the pandas goodness.
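Because the output is a regular DataFrame, everyday pandas operations apply directly. A small illustration with made-up numbers (the torque column mirrors the naming used by the kinetics module):

```python
import pandas as pd

# made-up stand-in for measurement wheel output
data = pd.DataFrame({
    "time": [0.000, 0.005, 0.010, 0.015],
    "torque": [0.0, 2.5, 5.0, 2.5],
})

mean_torque = data["torque"].mean()              # plain pandas aggregation
above_mean = data[data["torque"] > mean_torque]  # boolean indexing
```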

Communication (.com)

Contains functions for reading data from any worklab device. If you abide by regular naming conventions you will only need the load function which will infer the correct function for you. You can also use device-specific load functions if needed.

load

worklab.com.load(filename='')[source]

Attempt to load a common data format.

Most important function in the module. Provides a high-level loading function for common data formats. If no filename is given, it will open a file dialog to select one. It will try to infer the data source from the filename. Use a specific load function if load cannot infer the datatype.

Parameters

filename (str) – name or path to file of interest

Returns

data – raw data, format depends on source, but is usually a dict or pandas DataFrame

Return type

pd.DataFrame
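The inference step boils down to checking the filename and extension against known device conventions. A hypothetical sketch of that idea (the rules and names below are assumptions for illustration, not worklab's actual heuristics):

```python
from pathlib import Path

def guess_source(filename):
    """Guess a device from the filename (illustrative only)."""
    suffix = Path(filename).suffix.lower()
    if suffix == ".n3d":
        return "optotrak"    # NDI Optotrak export
    if suffix == ".xls":
        return "lem"         # LEM Excel sheet (ergometer/bike)
    if suffix == ".txt":
        return "smartwheel"  # SMARTwheel log
    return "unknown"

print(guess_source("session01.n3d"))  # optotrak
```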

load_bike

worklab.com.load_bike(filename)[source]

Load bicycle ergometer data from LEM datafile.

Loads bicycle ergometer data from LEM to a pandas DataFrame containing time, load, rpm, and heart rate (HR).

Parameters

filename (str) – full file path or file in existing path from LEM Excel sheet (.xls)

Returns

data – DataFrame with time, load, rpm, and HR data

Return type

pd.DataFrame

load_esseda

worklab.com.load_esseda(filename)[source]

Loads HSB ergometer data from LEM datafile.

Loads ergometer data measured with LEM and returns the data in a dictionary for the left and right module with a DataFrame each that contains time, force (on wheel), and speed.

Parameters

filename (str) – full file path or file in existing path from LEM Excel sheet (.xls)

Returns

data – dictionary with DataFrame for left and right module

Return type

dict

See also

load_wheelchair()

Load wheelchair information from LEM datafile.

load_spline()

Load calibration splines from LEM datafile.

load_wheelchair

worklab.com.load_wheelchair(filename)[source]

Loads wheelchair from LEM datafile.

Loads the wheelchair data from a LEM datafile. Note that LEM only recently added this to their exports. Returns:

Column    | Data                | Unit
----------|---------------------|-----
name      | chair name          | -
rimsize   | radius of handrim   | m
wheelsize | radius of the wheel | m
weight    | weight of the chair | kg

Parameters

filename (str) – full file path or file in existing path from LEM Excel sheet (.xls)

Returns

wheelchair – dictionary with wheelchair information

Return type

dict

See also

load_esseda()

Load HSB data from LEM datafile.

load_spline()

Load calibration splines from LEM datafile.

load_hsb

worklab.com.load_hsb(filename)[source]

Loads HSB ergometer data from HSB datafile.

Loads ergometer data measured with the HSBlogger2 and returns the data in a dictionary for the left and right module with a DataFrame each that contains time, force, and speed. HSB files are generally only for troubleshooting and testing that is beyond the scope of LEM.

Parameters

filename (str) – full file path or file in existing path from HSB .csv file

Returns

data – dictionary with DataFrame for left and right module

Return type

dict

load_n3d

worklab.com.load_n3d(filename, verbose=True)[source]

Reads NDI-Optotrak data files

Parameters
  • filename (str) – Optotrak data file (.n3d)

  • verbose (bool) – print some information about the data in the file, default is True

Returns

optodata – Multidimensional numpy array with marker positions (in m) in sample x xyz x marker dimensions.

Return type

ndarray

load_opti

worklab.com.load_opti(filename, rotate=True)[source]

Loads Optipush data from .data file.

Loads Optipush data to a pandas DataFrame, converts angle to radians, and flips torque (Tz). Returns a DataFrame with:

Column | Data                  | Unit
-------|-----------------------|-----
time   | sample time           | s
fx     | force on local x-axis | N
fy     | force on local y-axis | N
fz     | force on local z-axis | N
mx     | torque around x-axis  | Nm
my     | torque around y-axis  | Nm
torque | torque around z-axis  | Nm
angle  | unwrapped wheel angle | rad

Note

Optipush uses a local coordinate system; an option to rotate Fx and Fy is available in versions >1.6

Parameters
  • filename (str) – filename or path to Optipush .data (.csv) file

  • rotate (bool) – whether or not to rotate from a local rotating axis system to a global non-rotating one, default is True

Returns

opti_df – Raw Optipush data in a pandas DataFrame

Return type

pd.DataFrame

See also

load_sw()

Load measurement wheel data from a SMARTwheel

load_optitrack

worklab.com.load_optitrack(filename, include_header=False)[source]

Loads Optitrack marker data.

Parameters
  • filename (str) – full path to filename or filename in current path

  • include_header (bool) – whether or not to include the header in the output default is False

Returns

marker_data – Marker data in dictionary, metadata in dictionary

Return type

dict

load_imu

worklab.com.load_imu(root_dir, filenames=None)[source]

Imports NGIMU session in nested dictionary with all devices and sensors.

Import NGIMU session in nested dictionary with all devices with all sensors. Translated from xio-Technologies [1].

Parameters
  • root_dir (str) – directory where session is located

  • filenames (list, optional) – list of sensor names or a single sensor name to include; all sensors are loaded if not specified

Returns

session_data – returns nested object sensordata[device][sensor][dataframe]

Return type

dict

References

[1] https://github.com/xioTechnologies/NGIMU-MATLAB-Import-Logged-Data-Example

load_spiro

worklab.com.load_spiro(filename)[source]

Loads COSMED spirometer data from Excel file.

Loads spirometer data to a pandas DataFrame, converts time to seconds (not datetime), computes energy expenditure, and computes sample weights from the time difference between samples. If no heart rate data is available, the column is filled with np.NaNs. Returns a DataFrame with:

Column  | Data                       | Unit
--------|----------------------------|---------
time    | time at breath             | s
HR      | heart rate                 | bpm
EE      | energy expenditure         | J/s
RER     | exchange ratio             | VCO2/VO2
VO2     | oxygen                     | l/min
VCO2    | carbon dioxide             | l/min
VE      | ventilation                | l/min
VE/VO2  | ratio VE/VO2               | -
VE/VCO2 | ratio VE/VCO2              | -
O2pulse | oxygen pulse (VO2/HR)      | -
PetO2   | end expiratory O2 tension  | mmHg
PetCO2  | end expiratory CO2 tension | mmHg
VT      | tidal volume               | l
weights | sample weight              | -

Parameters

filename (str) – full file path or file in existing path from COSMED spirometer

Returns

data – Spirometer data in pandas DataFrame

Return type

pd.DataFrame

load_spline

worklab.com.load_spline(filename)[source]

Load wheelchair ergometer calibration spline from LEM datafile.

Loads the Esseda calibration spline from LEM, which includes all forces (at the roller) at the different calibration points (1 to 10 km/h in steps of 1 km/h).

Parameters

filename (str) – full file path or file in existing path from LEM excel file

Returns

data – left and right calibration values

Return type

dict

load_sw

worklab.com.load_sw(filename, sfreq=200)[source]

Loads SMARTwheel data from .txt file.

Loads SMARTwheel data to a pandas DataFrame, converts angle to radians and unwraps it. Returns a DataFrame with:

Column | Data                   | Unit
-------|------------------------|-----
time   | sample time            | s
fx     | force on global x-axis | N
fy     | force on global y-axis | N
fz     | force on global z-axis | N
mx     | torque around x-axis   | Nm
my     | torque around y-axis   | Nm
torque | torque around z-axis   | Nm
angle  | unwrapped wheel angle  | rad

Note

SMARTwheel uses a global coordinate system

Parameters
  • filename (str) – filename or path to SMARTwheel .txt file

  • sfreq (int) – sample frequency of the SMARTwheel, default is 200 Hz

Returns

sw_df – Raw SMARTwheel data in a pandas DataFrame

Return type

pd.DataFrame

See also

load_opti()

Load measurement wheel data from an Optipush wheel.

Kinetics (.kin)

Contains functions for working with measurement wheel (Optipush and SMARTwheel) and ergometer (Esseda) data. You will usually only need the top-level function auto_process.

auto_process

worklab.kin.auto_process(data, wheelsize=0.31, rimsize=0.27, sfreq=200, co_f=15, ord_f=2, co_s=6, ord_s=2, force=True, speed=True, variable='torque', cutoff=0.0, wl=201, ord_a=2, minpeak=5.0)[source]

Top level processing function that performs all processing steps for mw/ergo data.

Contains all signal processing steps in fixed order. It is advised to use this function for all (pre-)processing. If needed take a look at a specific function to see how it works.

Parameters
  • data (pd.DataFrame, dict) – raw ergometer or measurement wheel data

  • wheelsize (float) – wheel radius [m]

  • rimsize (float) – handrim radius [m]

  • sfreq (int) – sample frequency [Hz]

  • co_f (int) – cutoff frequency force filter [Hz]

  • ord_f (int) – order force filter [..]

  • co_s (int) – cutoff frequency speed filter [Hz]

  • ord_s (int) – order speed filter [..]

  • force (bool) – force filter toggle, default is True

  • speed (bool) – speed filter toggle, default is True

  • variable (str) – variable name used for peak (push) detection

  • cutoff (float) – noise level for peak (push) detection

  • wl (float) – window length angle filter

  • ord_a (int) – order angle filter [..]

  • minpeak (float) – min peak height for peak (push) detection

Returns

  • data (pd.DataFrame, dict)

  • pushes (pd.DataFrame, dict)

filter_mw

worklab.kin.filter_mw(data, sfreq=200.0, co_f=15.0, ord_f=2, wl=201, ord_a=2, force=True, speed=True)[source]

Filters measurement wheel data.

Filters raw measurement wheel data. Should be used before further processing.

Parameters
  • data (pd.DataFrame) – raw measurement wheel data

  • sfreq (float) – sample frequency [Hz]

  • co_f (float) – cutoff frequency force filter [Hz]

  • ord_f (int) – order force filter [..]

  • wl (float) – window length angle filter

  • ord_a (int) – order angle filter [..]

  • force (bool) – force filter toggle, default is True

  • speed (bool) – speed filter toggle, default is True

Returns

  • data (pd.DataFrame)

  • Same data but filtered.

filter_ergo

worklab.kin.filter_ergo(data, co_f=15.0, ord_f=2, co_s=6.0, ord_s=2, force=True, speed=True)[source]

Filters ergometer data.

Filters raw ergometer data. Should be used before further processing.

Parameters
  • data (dict) – raw measurement wheel data

  • co_f (float) – cutoff frequency force filter [Hz]

  • ord_f (int) – order force filter [..]

  • co_s (float) – cutoff frequency speed filter [Hz]

  • ord_s (int) – order speed filter [..]

  • force (bool) – force filter toggle, default is True

  • speed (bool) – speed filter toggle, default is True

Returns

data – Same data but filtered.

Return type

dict

process_mw

worklab.kin.process_mw(data, wheelsize=0.31, rimsize=0.275, sfreq=200)[source]

Basic processing for measurement wheel data.

Basic processing for measurement wheel data (e.g. speed to distance). Should be performed after filtering. Added columns:

Column | Data                 | Unit
-------|----------------------|------
aspeed | angular velocity     | rad/s
speed  | velocity             | m/s
dist   | cumulative distance  | m
acc    | acceleration         | m/s^2
ftot   | total combined force | N
uforce | effective force      | N
force  | force on wheel       | N
power  | power                | W
work   | instantaneous work   | J

Parameters
  • data (pd.DataFrame) – raw measurement wheel data

  • wheelsize (float) – wheel radius [m]

  • rimsize (float) – handrim radius [m]

  • sfreq (int) – sample frequency [Hz]

Returns

data

Return type

pd.DataFrame
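The computations described above follow directly from basic mechanics: angular speed is the derivative of the unwrapped angle, linear speed is angular speed times wheel radius, distance is the cumulative integral of speed, and power is torque times angular speed. A minimal pandas sketch of those steps on synthetic data (not worklab's implementation):

```python
import numpy as np
import pandas as pd

sfreq = 200        # sample frequency [Hz]
wheelsize = 0.31   # wheel radius [m]

# synthetic measurement wheel data: one wheel revolution at constant torque
n = 200
data = pd.DataFrame({
    "angle": np.linspace(0, 2 * np.pi, n),  # unwrapped wheel angle [rad]
    "torque": np.full(n, 5.0),              # torque around the wheel axis [Nm]
})

data["aspeed"] = np.gradient(data["angle"]) * sfreq  # angular velocity [rad/s]
data["speed"] = data["aspeed"] * wheelsize           # linear velocity [m/s]
data["dist"] = (data["speed"] / sfreq).cumsum()      # cumulative distance [m]
data["power"] = data["torque"] * data["aspeed"]      # power [W]
```

After one revolution the covered distance should be close to the wheel circumference (2π times the radius), which is a quick sanity check on the pipeline.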

process_ergo

worklab.kin.process_ergo(data, wheelsize=0.31, rimsize=0.275)[source]

Basic processing for ergometer data.

Basic processing for ergometer data (e.g. speed to distance). Should be performed after filtering. Added columns:

Column | Data                | Unit
-------|---------------------|------
angle  | angle               | rad
aspeed | angular velocity    | rad/s
acc    | acceleration        | m/s^2
dist   | cumulative distance | m
power  | power               | W
work   | instantaneous work  | J
uforce | effective force     | N
torque | torque around wheel | Nm

Note

The force column contains the force on the wheels; uforce (user force) is the force on the handrim.

Parameters
  • data (dict) – raw ergometer data

  • wheelsize (float) – wheel radius [m]

  • rimsize (float) – handrim radius [m]

Returns

data

Return type

dict

push_by_push_mw

worklab.kin.push_by_push_mw(data, variable='torque', cutoff=0.0, minpeak=5.0, mindist=5, verbose=True)[source]

Push-by-push analysis for measurement wheel data.

Push detection and push-by-push analysis for measurement wheel data. Returns a pandas DataFrame with:

Column             | Data                 | Unit
-------------------|----------------------|--------
start/stop/peak    | respective indices   | -
tstart/tstop/tpeak | respective samples   | s
cangle             | contact angle        | rad
cangle_deg         | contact angle        | degrees
mean/maxpower      | power per push       | W
mean/maxtorque     | torque per push      | Nm
mean/maxforce      | force per push       | N
mean/maxuforce     | (rim) force per push | N
mean/maxfeff       | feffective per push  | %
mean/maxftot       | ftotal per push      | N
work               | work per push        | J
cwork              | work per cycle       | J
negwork            | negative work/cycle  | J
slope              | slope onset to peak  | Nm/s
smoothness         | mean/peak force      | -
ptime              | push time            | s
ctime              | cycle time           | s
reltime            | relative push/cycle  | %

Parameters
  • data (pd.DataFrame) – measurement wheel DataFrame

  • variable (str) – variable name used for peak (push) detection

  • cutoff (float) – noise level for peak (push) detection

  • minpeak (float) – min peak height for peak (push) detection

  • mindist (int) – minimum sample distance between peak candidates, can be used to speed up the algorithm

Returns

pbp – push-by-push DataFrame

Return type

pd.DataFrame
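The timing columns can be derived from the detected start/stop indices alone: push time is stop minus start, cycle time is the interval between consecutive starts, and reltime is their ratio. A sketch with synthetic indices (not worklab's code):

```python
import numpy as np

sfreq = 200                          # samples per second
start = np.array([100, 400, 700])    # push start indices (synthetic)
stop = np.array([250, 560, 850])     # push stop indices (synthetic)

ptime = (stop - start) / sfreq       # push time [s]
ctime = np.diff(start) / sfreq       # cycle time [s], one fewer than pushes
reltime = ptime[:-1] / ctime * 100   # relative push/cycle [%]
```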

push_by_push_ergo

worklab.kin.push_by_push_ergo(data, variable='power', cutoff=0.0, minpeak=50.0, mindist=5, verbose=True)[source]

Push-by-push analysis for wheelchair ergometer data.

Push detection and push-by-push analysis for ergometer data. Returns a pandas DataFrame with:

Column             | Data                 | Unit
-------------------|----------------------|--------
start/stop/peak    | respective indices   | -
tstart/tstop/tpeak | respective samples   | s
cangle             | contact angle        | rad
cangle_deg         | contact angle        | degrees
mean/maxpower      | power per push       | W
mean/maxtorque     | torque per push      | Nm
mean/maxforce      | force per push       | N
mean/maxuforce     | (rim) force per push | N
work               | work per push        | J
cwork              | work per cycle       | J
negwork            | negative work/cycle  | J
slope              | slope onset to peak  | Nm/s
smoothness         | mean/peak force      | -
ptime              | push time            | s
ctime              | cycle time           | s
reltime            | relative push/cycle  | %

Parameters
  • data (dict) – wheelchair ergometer dictionary

  • variable (str) – variable name used for peak (push) detection, default = power

  • cutoff (float) – noise level for peak (push) detection, default = 0

  • minpeak (float) – min peak height for peak (push) detection, default = 50.0

  • mindist (int) – minimum sample distance between peak candidates, can be used to speed up algorithm

Returns

pbp – dictionary with left, right and mean push-by-push DataFrame

Return type

dict

Kinematics (.move)

Basic functions for movement related data from optical tracking systems. If I have the time I will make a vector3d class. Most functions assume an [n, 3] or [1, 3] array or dataframe.

get_perp_vector

worklab.move.get_perp_vector(vector2d, clockwise=True, normalized=True)[source]

Get the vector perpendicular to the input vector. Only works in 2D as 3D has infinite solutions.

Parameters
  • vector2d (np.array) – [n, 3] vector data, only uses x and y

  • clockwise (bool) – clockwise or counterclockwise rotation

  • normalized (bool) – whether or not to normalize the result, default is True

Returns

perp_vector2d – rotated vector

Return type

np.array
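In two dimensions a perpendicular vector follows from swapping the x and y components and negating one of them; which one is negated sets the rotation direction. A sketch of that idea (illustrative, not worklab's implementation):

```python
import numpy as np

def perp_2d(vector2d, clockwise=True, normalized=True):
    # rotate [n, 3] data 90 degrees in the xy-plane (illustrative sketch)
    v = np.atleast_2d(np.asarray(vector2d, dtype=float))
    out = np.zeros_like(v)
    if clockwise:
        out[:, 0], out[:, 1] = v[:, 1], -v[:, 0]
    else:
        out[:, 0], out[:, 1] = -v[:, 1], v[:, 0]
    if normalized:
        out /= np.linalg.norm(out, axis=1, keepdims=True)
    return out
```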

get_rotation_matrix

worklab.move.get_rotation_matrix(new_frame, local_to_world=True)[source]

Get the rotation matrix between a new reference frame and the global reference frame or the other way around.

Parameters
  • new_frame (np.array) – [3, 3] array specifying the new reference frame

  • local_to_world (bool) – local-to-world (True) or world-to-local (False) rotation

Returns

rotation_matrix – rotation matrix that can be used to rotate marker data, e.g.: rotation_matrix @ marker

Return type

np.array

get_orthonormal_frame

worklab.move.get_orthonormal_frame(point1, point2, point3, mean=False)[source]

Returns an orthonormal frame from three reference points. For example, a local coordinate system from three marker points.

Parameters
  • point1 (np.array) – first marker point, used as origin if mean=False

  • point2 (np.array) – second marker point, used as x-axis

  • point3 (np.array) – third marker point

  • mean (bool) – whether or not the mean should be used as origin, default is False

Returns

  • origin (np.array) – xyz column vector with coordinates of the origin which is point1 or the mean of all points

  • orthonormal (np.array) – 3x3 array with orthonormal coordinates [x, y, z] of the new axis system
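A common construction for such a frame: the normalized vector from point1 to point2 gives x, its cross product with the vector to point3 gives z (normal to the plane of the three points), and z × x completes a right-handed frame. A sketch under those assumptions (the exact axis convention of the worklab implementation may differ):

```python
import numpy as np

def orthonormal_frame(p1, p2, p3):
    # right-handed orthonormal frame from three points (sketch)
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x /= np.linalg.norm(x)
    z = np.cross(x, p3 - p1)   # normal to the plane of the three points
    z /= np.linalg.norm(z)
    y = np.cross(z, x)         # completes the right-handed frame
    return p1, np.column_stack([x, y, z])

origin, frame = orthonormal_frame([0, 0, 0], [1, 0, 0], [0, 1, 0])
```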

mirror

worklab.move.mirror(vector3d, axis='xyz')[source]

Simply mirror one or multiple axes.

Parameters
  • vector3d (np.array) – vector to be mirrored, also works on dataframes

  • axis (str) – string with axes to be mirrored

Returns

vector3d – mirrored vector

Return type

np.array

rotate

worklab.move.rotate(vector3d, angle, deg=False, axis='z')[source]

Rotate a vector around a single given axis, specify rotation angle in radians or degrees.

Parameters
  • vector3d (np.array) – vector to be rotated, also works on dataframes, assumes [n, xyz] data

  • angle (float) – angle to rotate over

  • deg (bool) – True if angle is specified in degrees, False for radians

  • axis (str) – axis to rotate over, default = “z”

Returns

vector3d – rotated vector

Return type

np.array
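Rotation around a single axis is a matrix product with the standard rotation matrix; the sketch below shows the z-axis case (simplified to one axis, not worklab's implementation):

```python
import numpy as np

def rotate_z(vector3d, angle, deg=False):
    # rotate [n, 3] data around the z-axis (illustrative sketch)
    if deg:
        angle = np.deg2rad(angle)
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s, 0.0],
                    [s, c, 0.0],
                    [0.0, 0.0, 1.0]])
    return np.asarray(vector3d, dtype=float) @ rot.T

rotated = rotate_z([[1.0, 0.0, 0.0]], 90, deg=True)
```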

scale

worklab.move.scale(vector3d, x=1.0, y=1.0, z=1.0)[source]

Scale a vector in different directions.

Parameters
  • vector3d (np.array) – array to be scaled, also works on dataframes, assumes [n, xyz] data

  • x (float) – x-axis scaling

  • y (float) – y-axis scaling

  • z (float) – z-axis scaling

Returns

vector3d – scaled array

Return type

np.array

magnitude

worklab.move.magnitude(vector3d)[source]

Calculates the vector magnitude using an l2 norm. Works with [1, 3] or [n, 3] vectors.

Parameters

vector3d (np.array) – a [1, 3] or [n, 3] vector

Returns

vector3d – scalar value or column vector

Return type

np.array

normalize

worklab.move.normalize(vector3d)[source]

Normalizes [n, 3] marker data using an l2 norm. Works with [1, 3] and [n, 3] vectors, both arrays and dataframes.

Parameters

vector3d (np.array) – marker data to be normalized

Returns

vector3d – normalized marker data

Return type

np.array

distance

worklab.move.distance(point1, point2)[source]

Compute Euclidean distance between two points, this is the distance if you were to draw a straight line.

Parameters
  • point1 (np.array) – a [1, 3] or [n, 3] array with point coordinates

  • point2 (np.array) – a [1, 3] or [n, 3] array with point coordinates

Returns

distance – distance from point1 to point2 in a [1, 3] or [n, 3] array

Return type

np.array

marker_angles

worklab.move.marker_angles(v_1, v_2, deg=False)[source]

Calculates n angles between two [n, 3] markers, two [1, 3] markers, or one [n, 3] and one [1, 3] marker.

Parameters
  • v_1 (np.array) – [n, 3] array or DataFrame for marker 1

  • v_2 (np.array) – [n, 3] array or DataFrame for marker 2

  • deg (bool) – return radians or degrees, default is radians

Returns

x – returns [n, 1] array with the angle for each sample or scalar value

Return type

np.array
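The angle between two vectors follows from the normalized dot product; a sketch (illustrative, not worklab's implementation):

```python
import numpy as np

def angle_between(v_1, v_2, deg=False):
    # angle between corresponding rows of two [n, 3] arrays (sketch)
    v_1, v_2 = np.atleast_2d(v_1), np.atleast_2d(v_2)
    cos = np.einsum("ij,ij->i", v_1, v_2) / (
        np.linalg.norm(v_1, axis=1) * np.linalg.norm(v_2, axis=1))
    angles = np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards rounding errors
    return np.rad2deg(angles) if deg else angles

angles = angle_between([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], deg=True)
```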

is_unit_length

worklab.move.is_unit_length(vector3d, atol=1e-08)[source]

Checks whether an array ([1, 3] or [n, 3]) is equal to unit length given a tolerance

IMU (.imu)

Basic functions for movement related data from IMUs. IMU functions are specifically made for the NGIMUs we use in the worklab.

resample_imu

worklab.imu.resample_imu(sessiondata, sfreq=400.0)[source]

Resample all devices and sensors to new sample frequency.

Resamples all devices and sensors to a new sample frequency. Sample intervals are not fixed with NGIMUs, so resampling before further analysis is recommended. Translated from xio-Technologies [2].

Parameters
  • sessiondata (dict) – original session data structure to be resampled

  • sfreq (float) – new intended sample frequency

Returns

sessiondata – resampled session data structure

Return type

dict

References

[2] https://github.com/xioTechnologies/NGIMU-MATLAB-Import-Logged-Data-Example

process_imu

worklab.imu.process_imu(sessiondata, camber=15, wsize=0.31, wbase=0.6, inplace=False)[source]

Calculate wheelchair kinematic variables based on NGIMU data

Parameters
  • sessiondata (dict) – original sessiondata structure

  • camber (float) – camber angle in degrees

  • wsize (float) – radius of the wheels

  • wbase (float) – width of wheelbase

  • inplace (bool) – performs operation inplace

Returns

sessiondata – sessiondata structure with processed data

Return type

dict

change_imu_orientation

worklab.imu.change_imu_orientation(sessiondata, inplace=False)[source]

Changes IMU orientation from in-wheel to on-wheel

Parameters
  • sessiondata (dict) – original sessiondata structure

  • inplace (bool) – perform operation inplace

Returns

sessiondata – sessiondata with reoriented gyroscope data

Return type

dict

push_imu

worklab.imu.push_imu(acceleration, sfreq=400.0)[source]

Push detection based on the velocity signal of an IMU on a wheelchair [3].

Parameters
  • acceleration (np.array, pd.Series) – acceleration data structure

  • sfreq (float) – sampling frequency

Returns

push_idx, acc_filt, n_pushes, cycle_time, push_freq

References

[3] van der Slikke, R., Berger, M., Bregman, D., & Veeger, D. (2016). Push characteristics in wheelchair court sport sprinting. Procedia Engineering, 147, 730-734.

butterfly

spider

sprint_10m

sprint_20m

vel_zones

worklab.imu.vel_zones(velocity, time)[source]

Calculate wheelchair velocity zones

Parameters
  • velocity (np.array, pd.Series) – velocity data structure

  • time (np.array, pd.Series) – time data structure

Returns

velocity_zones – velocity zones (m/s), 1-2, 2-3, 3-4, 4-5, 5 and above

Return type

dict
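The underlying computation amounts to summing the per-sample durations where the velocity falls in each band. A hedged sketch of that idea (the key names and exact band handling are assumptions, not worklab's code):

```python
import numpy as np

def velocity_zones(velocity, time):
    # seconds spent per velocity band (illustrative sketch)
    velocity = np.asarray(velocity, dtype=float)
    dt = np.gradient(np.asarray(time, dtype=float))  # per-sample duration [s]
    bands = [(1, 2), (2, 3), (3, 4), (4, 5), (5, np.inf)]
    return {f"{lo}-{hi}": float(dt[(velocity >= lo) & (velocity < hi)].sum())
            for lo, hi in bands}

zones = velocity_zones([1.5] * 10, np.arange(10) * 0.1)
```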

Physiology (.physio)

Basics for working with physiological data. We only have a spirometer in the lab at the moment, and this involves very little processing. Might expand with EMG-related functions at some point in the future.

get_spirometer_units

Plotting (.plots)

Most variables can easily be plotted with matplotlib or pandas, as most data in this package is contained in DataFrames. Some plots are tedious to set up, however, and these functions cover those cases.

plot_pushes

worklab.plots.plot_pushes(data, pushes, var='torque', start=True, stop=True, peak=True, ax=None)[source]

Plot pushes from measurement wheel or ergometer data.

Parameters
  • data (pd.DataFrame) – measurement wheel or ergometer DataFrame

  • pushes (pd.DataFrame) – push-by-push DataFrame

  • var (str) – variable to plot, default is torque

  • start (bool) – plot push starts, default is True

  • stop (bool) – plot push stops, default is True

  • peak (bool) – plot push peaks, default is True

  • ax (axis object) – Axis to plot on, you can add your own or it will make a new one.

Returns

ax

Return type

axis object

plot_pushes_ergo

worklab.plots.plot_pushes_ergo(data, pushes, title=None, var='power', start=True, stop=True, peak=True)[source]

Plot left, right and mean side ergometer push data

Parameters
  • data (dict) – processed ergometer data dictionary with dataframes

  • pushes (dict) – processed push_by_push ergometer data dictionary with dataframes

  • title (str) – title of the plot, optional

  • var (str) – variable to plot, default is power

  • start (bool) – plot push starts, default is True

  • stop (bool) – plot push stops, default is True

  • peak (bool) – plot push peaks, default is True

Returns

axes – an array containing an axis for the left, right and mean side

Return type

np.array

plot_power_speed_dist

worklab.plots.plot_power_speed_dist(data, title='', ylim_power=None, ylim_speed=None, ylim_distance=None)[source]

Plot power, speed, and distance versus time for the left (solid line) and right (dotted line) side separately.

The figure scales automatically unless you specify limits manually with the ylim_* arguments.

Parameters
  • data (dict) – processed ergometer data dictionary with dataframes

  • title (str) – a title for the plot

  • ylim_power (list [min, max] of float or int, optional) – list of the minimal and maximal ylim for power in W

  • ylim_speed (list [min, max] of floats or int, optional) – list of the minimal and maximal ylim for speed in km/h

  • ylim_distance (list [min, max] of floats or int, optional) – list of the minimal and maximal ylim for distance in m

Returns

  • fig (matplotlib.figure.Figure)

  • axes (tuple) – the three axes objects

acc_peak_dist_plot

worklab.plots.acc_peak_dist_plot(time, acc, dist, name='')[source]

Plot acceleration and distance versus time, with acc_peak

Parameters
  • time (np.array, pd.Series) – time structure

  • acc (np.array, pd.Series) – acceleration structure

  • dist (np.array, pd.Series) – distance structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

acc_peak_plot

worklab.plots.acc_peak_plot(time, acc, name='')[source]

Plot acceleration versus time, with acc_peak

Parameters
  • time (np.array, pd.Series) – time structure

  • acc (np.array, pd.Series) – acceleration structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

acc_plot

worklab.plots.acc_plot(time, acc, name='')[source]

Plot acceleration versus time

Parameters
  • time (np.array, pd.Series) – time structure

  • acc (np.array, pd.Series) – acceleration structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

imu_push_plot

worklab.plots.imu_push_plot(time, vel, acc_raw, name='')[source]

Plot push detection with IMUs

Parameters
  • time (dict) – time structure

  • vel (dict) – velocity structure

  • acc_raw (dict) – raw acceleration structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

rot_vel_plot

worklab.plots.rot_vel_plot(time, rot_vel, name='')[source]

Plot rotational velocity versus time

Parameters
  • time (np.array, pd.Series) – time structure

  • rot_vel (np.array, pd.Series) – rotational velocity structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

set_axes_equal_3d

worklab.plots.set_axes_equal_3d(axes)[source]

Set 3D plot axes to equal scale and size

Parameters

axes (matplotlib.axes._subplots.Axes3DSubplot) – axes containing 3D plotted data

vel_peak_dist_plot

worklab.plots.vel_peak_dist_plot(time, vel, dist, name='')[source]

Plot velocity and distance against time

Parameters
  • time (np.array, pd.Series) – time structure

  • vel (np.array, pd.Series) – velocity structure

  • dist (np.array, pd.Series) – distance structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

vel_peak_plot

worklab.plots.vel_peak_plot(time, vel, name='')[source]

Plot velocity versus time, with vel_peak

Parameters
  • time (np.array, pd.Series) – time structure

  • vel (np.array, pd.Series) – velocity structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

vel_plot

worklab.plots.vel_plot(time, vel, name='')[source]

Plot velocity versus time

Parameters
  • time (np.array, pd.Series) – time structure

  • vel (np.array, pd.Series) – velocity structure

  • name (str) – name of a session

Returns

ax

Return type

axis object

Utilities (.utils)

This module contains utility functions used by all modules or functions that have multiple applications such as filtering, finding zero-crossings, finding the nearest value in a signal.

pick_file

worklab.utils.pick_file(initialdir=None)[source]

Open a window to select a single file

Parameters

initialdir (str) – directory to start from

Returns

filename – full path to picked file

Return type

str

pick_files

worklab.utils.pick_files(initialdir=None)[source]

Open a window to select multiple files

Parameters

initialdir (str) – directory to start from

Returns

filenames – list of full paths to picked files

Return type

list

pick_directory

worklab.utils.pick_directory(initialdir=None)[source]

Open a window to select a directory

Parameters

initialdir (str) – directory to start from

Returns

directory – full path to selected directory

Return type

str

pick_save_file

worklab.utils.pick_save_file(initialdir=None)[source]

Open a window to select a savefile

Parameters

initialdir (str) – directory to start from

Returns

filename – full path to selected savefile

Return type

str

calc_weighted_average

make_calibration_spline

worklab.utils.make_calibration_spline(calibration_points)[source]

Makes a pre-1.0.4 calibration spline for the Esseda wheelchair ergometer.

Parameters

calibration_points (dict) – dict with left: np.array, right: np.array

Returns

spl_line – dict with left: np.array, right: np.array containing the interpolated splines

Return type

dict

make_linear_calibration_spline

worklab.utils.make_linear_calibration_spline(calibration_points)[source]

Makes a post-1.0.4 calibration spline for the Esseda wheelchair ergometer.

Parameters

calibration_points (dict) – dict with left: np.array, right: np.array

Returns

spl_line – dict with left: np.array, right: np.array containing the interpolated splines

Return type

dict

pd_dt_to_s

worklab.utils.pd_dt_to_s(dt)[source]

Calculates time in seconds from datetime or string.

Parameters

dt (pd.Series) – datetime instance or a string with H:m:s data

Returns

time – time in seconds

Return type

pd.Series
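The same conversion can be sketched with plain pandas, which parses H:m:s strings directly (a sketch of the idea, not worklab's implementation):

```python
import pandas as pd

dt = pd.Series(["0:00:01", "0:01:30", "1:00:00"])
seconds = pd.to_timedelta(dt).dt.total_seconds()  # time in seconds
```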

lowpass_butter

worklab.utils.lowpass_butter(array, sfreq=100.0, cutoff=20.0, order=2)[source]

Apply a simple zero-phase low-pass Butterworth filter on an array.

Parameters
  • array (np.array) – input array to be filtered

  • sfreq (float) – sample frequency of the signal, default is 100

  • cutoff (float) – cutoff frequency for the filter, default is 20

  • order (int) – order of the filter, default is 2

Returns

array – filtered array

Return type

np.array
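A minimal re-implementation sketch of what such a filter looks like (assuming SciPy, which the package already relies on; the actual worklab source may differ in detail):

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Sketch: zero-phase low-pass Butterworth with the documented defaults.
def lowpass_butter_sketch(array, sfreq=100.0, cutoff=20.0, order=2):
    b, a = butter(order, cutoff / (0.5 * sfreq), btype="low")
    return filtfilt(b, a, array)  # forward-backward pass -> zero phase shift

t = np.arange(0.0, 1.0, 1 / 100)                   # 1 s of data at 100 Hz
clean = np.sin(2 * np.pi * 2 * t)                  # 2 Hz component, below cutoff
noisy = clean + 0.5 * np.sin(2 * np.pi * 40 * t)   # add a 40 Hz component
filtered = lowpass_butter_sketch(noisy)            # 40 Hz component is attenuated
```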

interpolate_array

worklab.utils.interpolate_array(x, y, kind='linear', fill_value='extrapolate', assume=True)[source]

Simple function to interpolate an array with Scipy’s interp1d. Also extrapolates NaNs.

Parameters
  • x (np.array) – time array (without NaNs)

  • y (np.array) – array with potential NaNs

  • kind (str) – kind of filter, default is “linear”

  • fill_value (str) – fill value, default is “extrapolate”

  • assume (bool) – assume that the array is sorted (performance), default is True

Returns

y – interpolated y-array

Return type

np.array

pd_interp

worklab.utils.pd_interp(df, interp_column, at)[source]

Resamples (and extrapolates) a DataFrame with Scipy’s interp1d; this was more performant than the pandas equivalent for some reason.

Parameters
  • df (pd.DataFrame) – target DataFrame

  • interp_column (str) – column to interpolate on, e.g. “time”

  • at (np.array) – column to interpolate to

Returns

interp_df – interpolated DataFrame

Return type

pd.DataFrame

merge_chars

worklab.utils.merge_chars(chars)[source]

Merges a list or tuple of binary characters into a single string

Parameters

chars (list, tuple) – list or tuple of binary characters

Returns

concatenated characters

Return type

str
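This is likely equivalent to joining the bytes and decoding them; a sketch (the helper name is illustrative, not the package's code):

```python
# Probable equivalent of merge_chars: join binary characters, then decode.
def merge_chars_sketch(chars):
    return b"".join(chars).decode("utf-8")

print(merge_chars_sketch([b"w", b"o", b"r", b"k"]))  # 'work'
```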

find_peaks

worklab.utils.find_peaks(data, cutoff=1.0, minpeak=5.0, min_dist=5)[source]

Finds positive peaks in signal and returns indices of start and stop.

Parameters
  • data (pd.Series, np.array) – any signal that contains peaks above minpeak that dip below cutoff

  • cutoff (float) – where the peak gets cut off at the bottom, basically a hysteresis band

  • minpeak (float) – minimum peak height of wave

  • min_dist (int) – minimum sample distance between peak candidates, can be used to speed up algorithm

Returns

peaks – dictionary with start, end, and peak index of each peak

Return type

dict
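The start/stop logic can be illustrated with a simplified plain-Python sketch (no min_dist handling, plain lists instead of np.array/pd.Series; not the package's implementation):

```python
def find_peaks_sketch(data, cutoff=1.0, minpeak=5.0):
    """Hysteresis-style peak detection: a peak spans from where the signal
    rises above cutoff to where it dips back below it, and only counts if
    it reaches at least minpeak in between."""
    peaks = {"start": [], "end": [], "peak": []}
    start = None
    for i, value in enumerate(data):
        if value > cutoff and start is None:
            start = i                       # signal rises above the band
        elif value <= cutoff and start is not None:
            segment = data[start:i]         # signal dipped back below
            if max(segment) >= minpeak:     # tall enough to be a peak
                peaks["start"].append(start)
                peaks["end"].append(i)
                peaks["peak"].append(start + segment.index(max(segment)))
            start = None
    return peaks

signal = [0, 0, 2, 6, 8, 6, 2, 0, 0, 2, 3, 2, 0]
print(find_peaks_sketch(signal))
# {'start': [2], 'end': [7], 'peak': [4]} -- the small bump never reaches minpeak
```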

coast_down_velocity

worklab.utils.coast_down_velocity(t, v0, c1, c2, m)[source]

Solution for the non-linear differential equation M*(dv/dt) + c1*v**2 + c2 = 0. Returns the instantaneous velocity, decreasing with time (t), for friction coefficients c1 and c2 and an object with a fixed mass (M).

Parameters
  • t (np.array) –

  • v0 (float) –

  • c1 (float) –

  • c2 (float) –

  • m (float) –

Returns

Return type

np.array
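One standard closed-form solution of M*(dv/dt) + c1*v**2 + c2 = 0 is v(t) = sqrt(c2/c1) * tan(atan(v0*sqrt(c1/c2)) - t*sqrt(c1*c2)/M); the sketch below uses that form, which may or may not match the package's exact parameterisation:

```python
import math

# Closed-form solution sketch for M*dv/dt + c1*v**2 + c2 = 0
# (scalar version; the package operates on a time array).
def coast_down_velocity_sketch(t, v0, c1, c2, m):
    return math.sqrt(c2 / c1) * math.tan(
        math.atan(v0 * math.sqrt(c1 / c2)) - t * math.sqrt(c1 * c2) / m
    )

# Velocity decays from v0 as friction dissipates kinetic energy:
velocities = [coast_down_velocity_sketch(t, 2.0, 0.5, 5.0, 100.0) for t in range(4)]
```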

nonlinear_fit_coast_down

worklab.utils.nonlinear_fit_coast_down(time, vel, total_weight)[source]

Performs a nonlinear fit on coast-down data, returning c1 and c2.

Parameters
  • time (np.array) –

  • vel (np.array) –

  • total_weight (float) –

Returns

c1, c2

Return type

tuple

mask_from_iterable

worklab.utils.mask_from_iterable(array, floor_values, ceil_values)[source]

Combines multiple masks from iterable into one mask (e.g. can be used to select multiple time slices).

Parameters
  • array (np.array) – array to apply mask on

  • floor_values (list) – minimum values in array

  • ceil_values (list) – maximum values in array

Returns

mask

Return type

np.array
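Conceptually each floor/ceil pair defines a window, and the final mask is the union of all windows; a plain-Python sketch (the package version returns a boolean np.array):

```python
# Sketch: combine multiple floor/ceil windows into one boolean mask.
def mask_from_iterable_sketch(array, floor_values, ceil_values):
    return [
        any(floor <= value <= ceil for floor, ceil in zip(floor_values, ceil_values))
        for value in array
    ]

time = [0, 1, 2, 3, 4, 5, 6]
mask = mask_from_iterable_sketch(time, floor_values=[1, 5], ceil_values=[2, 6])
print(mask)  # [False, True, True, False, False, True, True]
```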

calc_inertia

worklab.utils.calc_inertia(weight=0.8, radius=0.295, length=0.675, period=1.0)[source]

Calculate the inertia of an object based on the trifilar pendulum equation.

Parameters
  • weight (float) – total mass of the object, default is 0.8

  • radius (float) – radius of the object, default is 0.295

  • length (float) – length of the trifilar pendulum, default is 0.675

  • period (float) – observed oscillation period, default is 1.0

Returns

inertia – inertia [kgm2]

Return type

float
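The trifilar pendulum equation is commonly stated as I = m·g·r²·T² / (4·π²·L); assuming that is the form used here (check the source if precision matters), the computation is:

```python
import math

# Trifilar pendulum inertia sketch, assuming I = m*g*r**2 * T**2 / (4*pi**2 * L).
def calc_inertia_sketch(weight=0.8, radius=0.295, length=0.675, period=1.0, g=9.81):
    return weight * g * radius ** 2 * period ** 2 / (4 * math.pi ** 2 * length)

print(round(calc_inertia_sketch(), 4))  # 0.0256 (kgm2, with the defaults)
```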

zerocross1d

worklab.utils.zerocross1d(x, y, indices=False)[source]

Find the zero crossing points in 1d data.

Find the zero crossing events in a discrete data set. Linear interpolation is used to determine the actual locations of the zero crossing between two data points showing a change in sign. Data points which are zero are counted as zero crossings if a sign change occurs across them. Note that the first and last data points will not be considered, whether or not they are zero.

Parameters
  • x (np.array, pd.Series) – time/sample variable

  • y (np.array, pd.Series) – y variable

  • indices (bool) – return indices or not, default is False

Returns

position in time and optionally the index of the sample before the zero-crossing

Return type

np.array
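The interpolation step can be sketched in plain Python (this simplified version skips the exact-zero-sample handling described above):

```python
# Sketch: linearly interpolated zero crossings between sign changes.
def zerocross_sketch(x, y):
    crossings = []
    for i in range(len(y) - 1):
        if y[i] * y[i + 1] < 0:  # sign change between consecutive samples
            # fraction of the interval at which y crosses zero
            frac = y[i] / (y[i] - y[i + 1])
            crossings.append(x[i] + frac * (x[i + 1] - x[i]))
    return crossings

print(zerocross_sketch([0, 1, 2, 3], [1.0, -1.0, -2.0, 2.0]))  # [0.5, 2.5]
```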

camel_to_snake

worklab.utils.camel_to_snake(name: str)[source]

Turns CamelCased text into snake_cased text.

Parameters

name (str) – StringToConvert

Returns

converted_string

Return type

str
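A common regex-based way to do this conversion (the package's implementation is probably similar, but this sketch is not taken from its source):

```python
import re

# Insert an underscore before every capital (except a leading one), then lowercase.
def camel_to_snake_sketch(name: str) -> str:
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(camel_to_snake_sketch("StringToConvert"))  # string_to_convert
```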

find_nearest

worklab.utils.find_nearest(array, value, index=False)[source]

Find the nearest value in an array or the index thereof.

Parameters
  • array (np.array) – array which has to be searched

  • value (float) – value that you are looking for

  • index (bool) – whether or not you want the index

Returns

value or index of nearest value

Return type

np.array
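A plain-Python sketch of the same idea (the package version works on np.array and is vectorised):

```python
# Sketch: nearest value by absolute distance, optionally returning the index.
def find_nearest_sketch(array, value, index=False):
    idx = min(range(len(array)), key=lambda i: abs(array[i] - value))
    return idx if index else array[idx]

data = [0.0, 2.5, 5.0, 7.5]
print(find_nearest_sketch(data, 4.1))              # 5.0
print(find_nearest_sketch(data, 4.1, index=True))  # 2
```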

binned_stats

worklab.utils.binned_stats(array, bins=10, pad=True, func=<function mean>, nan_func=<function nanmean>)[source]

Apply a compatible NumPy function to each consecutive group of bins samples (e.g. mean or std).

Parameters
  • array (np.array) – array which has to be searched

  • bins (int) – number of samples to be averaged

  • pad (bool) – whether or not to pad the array with NaNs if needed

  • func – function that is used when no padding is applied

  • nan_func – function that is used when padding is applied

Returns

means – array with the function result for each group of bins samples.

Return type

np.array
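The binning idea in plain Python, for the mean (the real function pads the tail with NaNs and switches to nan_func, e.g. np.nanmean; this sketch simply averages a shorter final bin, which gives the same result for the mean):

```python
# Sketch: reduce every `bins` consecutive samples to one value (here, the mean).
def binned_mean_sketch(array, bins=10):
    return [
        sum(chunk) / len(chunk)
        for chunk in (array[i:i + bins] for i in range(0, len(array), bins))
    ]

print(binned_mean_sketch([1, 2, 3, 4, 5, 6, 7], bins=3))  # [2.0, 5.0, 7.0]
```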

Timer

class worklab.utils.Timer(name='', text='Elapsed time: {:0.4f} seconds', start=True)[source]

Simple timer for timing code (blocks).

Parameters
  • name (str) – name of timer, gets saved in Timer.timers optional

  • text (str) – custom text, optional

  • start (bool) – automatically start the timer when it’s initialized, default is True

start()[source]

start the timer

stop()[source]

stop the timer, prints and returns the time

lap()[source]

print the time between this lap and the previous one
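Typical usage looks as follows; the minimal stand-in class below mirrors the documented interface for illustration (it is not the package's implementation, which also records named timers in Timer.timers):

```python
import time

# Minimal stand-in with the documented start/stop surface.
class TimerSketch:
    def __init__(self, name="", text="Elapsed time: {:0.4f} seconds", start=True):
        self.name, self.text = name, text
        if start:
            self.start()  # start on construction, as documented

    def start(self):
        self._start = time.perf_counter()

    def stop(self):
        elapsed = time.perf_counter() - self._start
        print(self.text.format(elapsed))  # prints, then returns the time
        return elapsed

timer = TimerSketch("demo")   # starts immediately
time.sleep(0.01)              # ... code block being timed ...
elapsed = timer.stop()        # prints "Elapsed time: 0.0... seconds"
```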