CompactionAnalyzer

Cells apply contractile forces to their surroundings, e.g. during migration, development, wound healing, or in various diseases. To study these processes, cellular forces can be measured using traction force microscopy. However, 3D traction force microscopy can be very laborious (nonlinear finite-element models, material rheology, regularization). In 3D fiber networks, alignment and compaction of fibers are a consequence of cellular forces. The method presented here quantifies the amount of fiber alignment and the fiber density around cells embedded in fiber matrices. These quantities can then be used as a proxy for contractile force (or for other purposes).

Hepatic stellate cell compacting a collagen type I hydrogel over time.

Quantification of tissue compaction around cells

Python package to quantify the tissue compaction (as a measure of contractile strength) generated by cells or multicellular spheroids that are embedded in fiber materials. For this, we provide the following two approaches:

  • Evaluating the directionality of fibers towards the cell center (a minimal sketch of this idea is shown below the list).
  • Evaluating the increased fiber intensity around the cell.
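To give an impression of the first approach, the following minimal sketch, which is not the package's internal implementation, estimates local fiber orientations with a structure tensor and scores how strongly they point towards the cell center; the function radial_orientation and its defaults are made up for this illustration.

# Illustrative sketch only (not the CompactionAnalyzer implementation):
# estimate local fiber orientations with a structure tensor and score how
# strongly they point towards the cell center.
import numpy as np
from scipy.ndimage import gaussian_filter

def radial_orientation(fiber_image, center, sigma=10):
    """Coherency-weighted cos(2*delta) score in [-1, 1]; center = (y, x) in px."""
    I = fiber_image.astype(float)
    Iy, Ix = np.gradient(I)                          # image gradients
    Jxx = gaussian_filter(Ix * Ix, sigma)            # structure tensor components,
    Jyy = gaussian_filter(Iy * Iy, sigma)            # smoothed on the length scale sigma
    Jxy = gaussian_filter(Ix * Iy, sigma)
    # fibers run perpendicular to the local intensity gradient
    theta_fiber = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy) + np.pi / 2
    # coherency = local anisotropy (0 = isotropic, 1 = perfectly aligned)
    coherency = np.sqrt((Jxx - Jyy) ** 2 + 4 * Jxy ** 2) / (Jxx + Jyy + 1e-12)
    yy, xx = np.indices(I.shape)
    theta_radial = np.arctan2(yy - center[0], xx - center[1])   # direction relative to the cell center
    score = np.cos(2 * (theta_fiber - theta_radial))            # +1 radial, -1 circumferential, 0 random
    return np.sum(score * coherency) / np.sum(coherency)

A value close to +1 indicates fibers aligned radially towards the cell, while a value close to 0 indicates a randomly oriented network.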

Installation

Simply install the CompactionAnalyzer via pip by running the following command in the console:

pip install CompactionAnalyzer

Alternatively, you can download a standalone ".exe" executable of the 3D TFM software saenopy here. CompactionAnalyzer is included there; just select the "Orientation" tab.

Alternatively, the package can be installed by cloning this repository or downloading it as a zip file here. For installation, run the command pip install -e . within the unzipped folder in which the setup.py file is located. This automatically downloads and installs all other required packages.

Preprint

If you want to read more or cite the CompactionAnalyzer, you can refer to our preprint:

Fiber alignment in 3D collagen networks as a biophysical marker for cell contractility
David Böhringer, Andreas Bauer, Ivana Moravec, Lars Bischof, Delf Kah, Christoph Mark, Thomas J Grundy, Ekkehard Goerlach, Geraldine M O'Neill, Silvia Budday, Pamela Strissel, Reiner Strick, Andrea Malandrino, Richard Gerum, Michael Mak, Martin Rausch, Ben Fabry
doi: https://doi.org/10.1101/2023.06.28.546896

Tutorial

The scripts within the tutorial folder are a good starting point to get familiar with the analysis: the script CompactionAnalysis_cells_collagen.py evaluates 4 example cells that are embedded in collagen and compact the surrounding collagen. The fiber structure was recorded using second-harmonic imaging and the cell outline using calcein staining.

The further scripts CompactionAnalysis_empty_collagen.py & CompactionAnalysis_artificial_data.py evaluate empty collagen gels that show random fiber alignment, and artificial data with random alignment.

In these scripts, we start by importing all necessary functions using

from CompactionAnalyzer.CompactionFunctions import *

For the analysis, we need, for each cell, an image of the fiber structure (e.g. second-harmonic, confocal reflection, or stained fluorescence images; a maximum intensity projection around the cell can be useful) and an image of the cell itself for segmentation (staining or brightfield).
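If the raw data are confocal z-stacks, such a maximum intensity projection can be created beforehand, e.g. with the tifffile package; the file names below are placeholders and this preprocessing step is not part of CompactionAnalyzer itself:

# Optional preprocessing (not part of CompactionAnalyzer): maximum intensity
# projection of a confocal z-stack; file names are placeholders.
import tifffile

stack = tifffile.imread(r"C:\user\imagedata\cell_1\stack_ch00.tif")    # shape (z, y, x)
mip = stack.max(axis=0)                                                # project along z
tifffile.imwrite(r"C:\user\imagedata\cell_1\cell_1_ch00_mip.tif", mip)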

We define the input data for the fibers using fiber_list_string and the cells using cell_list_string (here we can use the * placeholder to select multiple images). generate_lists() then searches all specified fiber and cell paths and creates the output subfolders in the specified output_folder directory completely automatically.

output_folder = r"C:\user\Analysis_output"                    # output folder that will be created and filled automatically
fiber_list_string =  r"C:\user\imagedata\cell_*\*ch00*.tif"   # input fiber images of all cells 
cell_list_string =  r"C:\user\imagedata\cell_*\*ch01*.tif"    # input stained images of all cells 

fiber_list, cell_list, out_list = generate_lists(fiber_list_string, cell_list_string, output_main=output_folder)

We now want to start the analysis and compute the orientation of individual fibers using structure tensor analysis. Here, sigma_tensor is the kernel size that determines the length scale on which the structure is analyzed. The kernel size should be in the range of the structure size we want to look at and can be optimized for the individual application. For our fiber gels we use a value of 7 µm, which is in the range of the pore size. The script DetermineWindowSize.py in the tutorial folder provides a template to systematically test different window sizes on the same image pair and from that select the ideal sigma_tensor for this setup (the ideal value displays a peak in the orientation).
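As a rough illustration of what such a window-size scan can look like (a sketch only; DetermineWindowSize.py is the actual template, and the paths and output folder names below are placeholders), one can run the analysis for several sigma values on the same image pair and afterwards compare the resulting orientations:

# Hedged sketch of a window-size scan; the tutorial script DetermineWindowSize.py
# provides the actual template. Paths and folder names are placeholders.
import numpy as np
from CompactionAnalyzer.CompactionFunctions import generate_lists, StuctureAnalysisMain

scale = 0.318                        # µm per pixel
sigmas_um = np.arange(2, 16, 1)      # candidate window sizes in µm

for s in sigmas_um:
    fiber_list, cell_list, out_list = generate_lists(
        r"C:\user\imagedata\cell_1\*ch00*.tif",
        r"C:\user\imagedata\cell_1\*ch01*.tif",
        output_main=rf"C:\user\WindowsizeTest\sigma_{s}um")
    # remaining parameters are left at their defaults here (assumption)
    StuctureAnalysisMain(fiber_list=fiber_list, cell_list=cell_list,
                         out_list=out_list, scale=scale,
                         sigma_tensor=s / scale)

# afterwards, read the results_total.xlsx file of each run and plot the
# orientation against sigma; the ideal window size shows a peak in orientation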

We can adjust all of the following parameters before starting the analysis. The corresponding pixel scale is set via scale, and the segmentation can be tuned using segmention_thres or by changing the local contrast enhancement via seg_gaus1 and seg_gaus2. With show_segmentation = True we can inspect the segmentation or, if preferred, segment the mask manually by clicking using manual_segmention = True. Further, a maximal distance around the cell center can be specified for the analysis using max_dist.

scale = 0.318                   # image scale in µm per pixel
sigma_tensor = 7 / scale        # sigma of the applied Gauss filter / window size for the structure tensor analysis in px
                                # should be in the order of the objects to analyze!
                                # 7 µm for collagen
edge = 40                       # cut off pixels at the edge, since values at the border cannot be trusted
segmention_thres = 1.0          # for cell segmentation; thres 1 equals the normal Otsu threshold, change to detect a different percentage of bright pixels
max_dist = None                 # optional: specify the maximal distance around the cell center for the analysis (in px)
seg_gaus1, seg_gaus2 = 0.5, 100 # two Gauss filters used as a bandpass filter for local contrast enhancement; for seg_gaus2 = None a single Gauss filter is applied
regional_max_correction = True  # background correction using a regional-maxima approach
show_segmentation = False       # display the segmentation output (the script won't run further)
sigma_first_blur = 0.5          # slight initial blurring of the whole image before applying the structure tensor
angle_sections = 5              # size of angle sections in degrees
shell_width = None              # pixel width of the distance shells (px value = µm value / scale)
manual_segmention = False       # segmentation of the mask by manually clicking the cell outline
plotting = True                 # creates and saves individual figures in addition to the excel files
dpi = 200                       # resolution of figures
SaveNumpy = False               # saves numpy arrays for later analysis - can create large data files
norm1, norm2 = 1, 99            # contrast spreading of the input images between the norm1- and norm2-percentile; values below the norm1-percentile are set to zero and
                                # values above the norm2-percentile are set to 1
seg_invert = False              # if the segmentation is inverted, dark objects are detected instead of bright objects
seg_iter = 1                    # number of repetitions of the binary closing, dilation and hole-filling steps
segmention_method = "otsu"      # use "otsu", "entropy" or "yen" as segmentation method
segmention_min_area = 1000      # small objects below this px area are removed during cell segmentation
load_segmentation = False       # if True, enter the path of the segmentation.npy file in path_seg
path_seg = None                 # to load a mask

Now we start to analyse all our cells individually using the single function StuctureAnalysisMain (follow the scripts in the tutorial folder):

# Start the structure analysis with the parameters specified above
StuctureAnalysisMain(fiber_list=fiber_list,
                     cell_list=cell_list,
                     out_list=out_list,
                     scale=scale,
                     sigma_tensor=sigma_tensor,
                     ...)

For each cell we now receive three Excel files:

  • results_total.xlsx - evaluates the overall orientation within the field of view
  • results_distance.xlsx - evaluates the orientation & intensity in distance shells towards the cell surface
  • results_angle.xlsx - evaluates the orientation & intensity in angle sections around the cell center

To compare different cells, we can use e.g. the total orientation within the field of view (this requires that all cells have the same field of view), or we can compare the intensity values in the first distance shell(s).
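As a minimal example of such a comparison (a sketch; it assumes each cell's output subfolder directly contains its results_total.xlsx, and the exact column names inside the file are not spelled out here), the per-cell result files can simply be collected with pandas:

# Sketch: collect the results_total.xlsx files of all analyzed cells with pandas;
# assumes each cell's output subfolder contains its results_total.xlsx directly.
import glob
import pandas as pd

files = sorted(glob.glob(r"C:\user\Analysis_output\*\results_total.xlsx"))
totals = pd.concat([pd.read_excel(f) for f in files], ignore_index=True)
print(totals.describe())    # e.g. compare the orientation columns between cells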

If we want to evaluate a measurement containing multiple cells, we can read in all Excel files (of the individual cells) in the subfolders of the given data path and combine them in a new Excel file by using:

SummarizeResultsTotal(data="Analysis_output", output_folder=r"Analysis_output\Combine_Set1")
SummarizeResultsDistance(data="Analysis_output", output_folder=r"Analysis_output\Combine_Set1")

Note: These functions search all subfolders for the results_total.xlsx and results_distance.xlsx files. If you want to discard outliers, it can be practical to rename the corresponding files, for example to _results_total.xlsx and _results_distance.xlsx.
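A small helper for that renaming could look like this (the folder name cell_3 is a placeholder for an outlier cell):

# Exclude an outlier cell from the summary by prefixing its result files with "_"
import os

outlier_folder = r"C:\user\Analysis_output\cell_3"    # placeholder path
for name in ("results_total.xlsx", "results_distance.xlsx"):
    src = os.path.join(outlier_folder, name)
    if os.path.exists(src):
        os.rename(src, os.path.join(outlier_folder, "_" + name))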

We receive a combined Excel sheet that contains the global analysis for all cells, and another Excel sheet with the mean distance analysis. The different Mean Angle and Orientation columns refer to the angular deviation between all orientation vectors and the direction to the respective cell center, and to the resulting orientation. These quantities are weighted by the coherency (orientation strength) and, additionally, by both the coherency and the image intensity. From the different cells we can now calculate further quantities, for example the mean orientation (weighted by intensity and coherency) of all cells, which is named Overall weighted Oriantation (mean all cells) and also stored in the same Excel file.

Graphical User Interface (GUI)

For easy use of the CompactionAnalyzer, we provide a graphical user interface (GUI) that simplifies the execution and evaluation of several experiments. To start the GUI, simply run the script GUI.py. Pairs of fiber and cell images can be loaded individually or batchwise by using the * placeholder.

Parameters can be configured, and the cell segmentation can be viewed and changed individually per cell. Upon Run, the analysis is started and the results are stored in the specified output folder.

For data analysis, the results_total.xlsx files can be loaded again, individually or batchwise by using the * placeholder. Intensity and orientation can then be evaluated individually or by adding several cells to user-defined groups. Bar and distance plots (mean ± SE) are created automatically, and individual Python scripts to re-plot the data can be exported.

Resolving Drug Effects & Multicellular Compaction Assay

One application of the CompactionAnalyzer is resolving drug-dependent effects on the contractility of individual cells or multicellular aggregates.

Additionally, absolute forces of spheroids can be measured using the jointforces Python package here, which requires additional material measurements & time-lapse imaging. Absolute forces of single cells can be assessed using saenopy here, which requires additional material measurements and two (larger) 3D stacks of the contracted and relaxed state per cell.