CompactionAnalyzer

Cells apply contractile forces to their surroundings, e.g. during migration, development, wound healing, or in various diseases. To study these processes, cellular forces can be measured using traction force microscopy. However, 3D traction force microscopy can be very laborious (nonlinear finite-element models, rheology, regularization). In 3D fiber networks, alignment and compaction of fibers are a consequence of cellular forces. The method presented here quantifies the amount of fiber alignment and fiber density around cells in fiber matrices. These quantities can then be used as a proxy for contractile force and to resolve drug-induced changes in the matrix fiber arrangement (or for other purposes).

<p align="center"> <img src="../master//docs/images/TimelapseHSC.gif" width="700" /> </p> <p align="center"> Hepatic stellate cell compacting a collagen type I hydrogel over time. </p>

Quantification of tissue compaction around cells

Python package to quantify the tissue compaction (as a measure of contractile strength) generated by cells or multicellular spheroids that are embedded in fiber materials. For this, the package quantifies two complementary measures: the alignment of fibers towards the cell and the fiber density (intensity) around the cell.

Further, the package can be used to evaluate the strength of alignment (coherency) regardless of the alignment direction, or the alignment orientation towards the x-axis (instead of the cell center). For this, see the section "Save Numpy Files" below.

<img src="../master//docs/images/Fig1-rawtostructure.png" width="1000" />

Installation & Graphical User Interface

Option A (for python users):

Simply install the CompactionAnalyzer via pip by running the following command in the console:

pip install CompactionAnalyzer

Option B (without python installation):

Alternatively, you can download a standalone ".exe" executable of the 3D TFM software saenopy here.

Graphical User Interface: CompactionAnalyzer is included within the saenopy package. For the graphical user interface, simply select the "Orientation" tab.

Read More

If you'd like to learn more about the method, have a look at our paper:

Böhringer D, Bauer A, Moravec I, Bischof L, Kah D, Mark C, Grundy TJ, Görlach E, O'Neill GM, Budday S, Strissel PL, Strick R, Malandrino A, Gerum R, Mak M, Rausch M, Fabry B. Fiber alignment in 3D collagen networks as a biophysical marker for cell contractility. Matrix Biology (2023). doi: 10.1016/j.matbio.2023.11.004

Tutorial (saenopy)

If you are using saenopy with its graphical user interface, simply click and load an example data set.

Tutorial (python API)

To use the Python API, the scripts within the tutorial folder might be a good start to get familiar with the analysis: The script Example_CellCollagen.py evaluates 4 example cells that are embedded in collagen and compact the surrounding collagen. The fiber structure was recorded using second-harmonic imaging and the cell outline using calcein staining.

The further scripts CompactionAnalysis_empty_collagen.py & CompactionAnalysis_artificial_data.py evaluate empty collagen gels that show random fiber alignment and artificial data with random alignment.

In these scripts, we start by importing all necessary functions using

from CompactionAnalyzer.CompactionFunctions import *

For the analysis, we need for each cell an image of the fiber structure (e.g. second-harmonic, confocal reflection, or stained fluorescence images; a maximum intensity projection around the cell might be useful) and an image of the cell for segmentation (staining or brightfield).
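If the fiber structure was recorded as a z-stack, a maximum intensity projection can for example be computed beforehand with numpy and tifffile; this preprocessing sketch and the file names are only an illustration and not part of the package:

import numpy as np
import tifffile

# load a z-stack of the fiber channel (shape: z, y, x); example path only
stack = tifffile.imread(r"C:\user\imagedata\cell_1\fiber_stack_ch00.tif")
# maximum intensity projection along the z-axis
max_projection = np.max(stack, axis=0)
tifffile.imwrite(r"C:\user\imagedata\cell_1\fiber_MAX_ch00.tif", max_projection)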

We define the input data for the fibers using fiber_list_string and the cells using cell_list_string (here we can utilize the * placeholder to select multiple images). generate_lists() then searches all specified fiber and cell paths and creates the output subfolders in the specified output_folder directory completely automatically.

output_folder = r"C:\user\Analysis_output"                    # output folder that will be created and filled automatically
fiber_list_string =  r"C:\user\imagedata\cell_*\*ch00*.tif"   # input fiber images of all cells 
cell_list_string =  r"C:\user\imagedata\cell_*\*ch01*.tif"    # input stained images of all cells 

fiber_list,cell_list, out_list = generate_lists(fiber_list_string, cell_list_string, output_main =output_folder)

We now want to start the analysis and compute the orientation of individual fibers using structure tensor analysis. Here sigma_tensor is the kernel size that determines the length scale on which the structure is analysed. The kernel size should be in the range of the structure size we want to look at and can be optimized for the individual application. For our fiber gels we use a value of 7 µm, which is in the range of the pore size. The script DetermineWindowSize.py in the tutorial folder provides a template to systematically test different window sizes on the same image pair and, from that, select the ideal sigma_tensor for this setup (which displays a peak in the orientation).
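A minimal sketch of such a window-size scan is shown below. It is only loosely based on DetermineWindowSize.py, reuses fiber_list_string and cell_list_string from above, and assumes that the candidate sigma values and folder layout are arbitrary examples and that the remaining keyword arguments of StuctureAnalysisMain keep their defaults (or are set as in the parameter block below):

import os
from CompactionAnalyzer.CompactionFunctions import generate_lists, StuctureAnalysisMain

scale = 0.318                                            # µm per pixel of the example data
output_folder = r"C:\user\Analysis_windowsize"           # hypothetical output folder for this test

fiber_list, cell_list, _ = generate_lists(fiber_list_string, cell_list_string,
                                          output_main=output_folder)

for sigma_um in [3, 5, 7, 10, 15]:                       # candidate window sizes in µm (example values)
    out_list = [os.path.join(output_folder, "sigma_%dum" % sigma_um)]  # one output folder per window size
    StuctureAnalysisMain(fiber_list=fiber_list[:1],      # test on the first image pair only
                         cell_list=cell_list[:1],
                         out_list=out_list,
                         scale=scale,
                         sigma_tensor=sigma_um / scale)  # remaining parameters as in the parameter block below
# afterwards, compare the total orientation in the generated results_total.xlsx files
# and pick the sigma_tensor at which the orientation peaks
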

We can adjust all of the following parameters before starting the analysis. The corresponding pixel scale is set as scale, and the segmentation can be changed by using segmention_thres or by changing the local contrast enhancement via seg_gaus1, seg_gaus2. With show_segmentation = True we can inspect the segmentation or - if preferred - segment the mask manually by clicking using manual_segmention = True. Further, a maximal distance around the cell center can be specified for the analysis using max_dist.

scale = 0.318                   # image scale in µm per pixel
sigma_tensor = 7/scale          # sigma of the applied Gauss filter / window size for the structure tensor analysis in px
                                # should be in the order of the objects to analyze!
                                # 7 µm for collagen
edge = 40                       # cut off pixels at the edge, since values at the border cannot be trusted
segmention_thres = 1.0          # for cell segmentation; thres 1 equals the normal Otsu threshold, change it to detect a different percentage of bright pixels
max_dist = None                 # optional: specify the maximal distance around the cell center for the analysis (in px)
seg_gaus1, seg_gaus2 = 0.5, 100 # two Gauss filters used as a bandpass filter for local contrast enhancement; for seg_gaus2 = None a single Gauss filter is applied
regional_max_correction = True  # background correction using a regional-maxima approach
show_segmentation = False       # display the segmentation output (the script won't run further)
sigma_first_blur = 0.5          # slight initial blurring of the whole image before applying the structure tensor
angle_sections = 5              # size of angle sections in degrees
shell_width = None              # pixel width of the distance shells (px value = µm value / scale)
manual_segmention = False       # segmentation of the mask by manually clicking the cell outline
plotting = True                 # creates and saves individual figures in addition to the Excel files
dpi = 200                       # resolution of figures
SaveNumpy = False               # saves numpy arrays for later analysis - can create large data files
norm1, norm2 = 1, 99            # contrast spreading of the input images between the norm1- and norm2-percentile; values below the norm1-percentile are set to zero and
                                # values above the norm2-percentile are set to 1
seg_invert = False              # if the segmentation is inverted, dark objects are detected instead of bright objects
seg_iter = 1                    # number of repetitions of the binary closing, dilation and hole-filling steps
segmention_method = "otsu"      # use "otsu", "entropy" or "yen" as segmentation method
segmention_min_area = 1000      # small objects below this px area are removed during cell segmentation
load_segmentation = False       # if True, enter the path to the segmentation.npy file in path_seg
path_seg = None                 # path to a mask to load
ignore_cell_outline = False     # if True, values in the cell-occupied area are also included in the analysis

Now we analyse all our cells individually using the single function StuctureAnalysisMain (follow the scripts in the tutorial folder):

# Start the structure analysis with the above specified parameters
StuctureAnalysisMain(fiber_list=fiber_list,
                     cell_list=cell_list, 
                     out_list=out_list,
                     scale=scale, 
                     sigma_tensor = sigma_tensor , 
                     ...)

For each cell we now receive three Excel files containing the analysis results; the files results_total.xlsx and results_distance.xlsx are used in the following steps.

To compare different cells, we can for example use the total orientation within the field of view (this requires that all cells have the same field of view), or we can compare the intensity values in the first distance shell(s).
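As a minimal sketch (not part of the package), the per-cell result sheets can also be collected manually with pandas for such comparisons; the folder layout is taken from the example above, and the column names depend on the package version:

import glob
import pandas as pd

# collect the results_total.xlsx sheets of all evaluated cells
sheets = []
for path in glob.glob(r"C:\user\Analysis_output\**\results_total.xlsx", recursive=True):
    df = pd.read_excel(path)
    df["cell"] = path                     # remember which cell/folder each row came from
    sheets.append(df)

combined = pd.concat(sheets, ignore_index=True)
print(combined.columns.tolist())          # inspect the available orientation and intensity measures

Alternatively, the convenience functions described next perform this combination automatically.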

<img src="../master//docs/images//Fig2-orientationeval.png" width="1000" />

If we want to evaluate a measurement containing multiple cells, we can read in all Excel files (of the individual cells) in the subfolders of the given data path and combine them into a new Excel file by using:

SummarizeResultsTotal(data="Analysis_output", output_folder=r"Analysis_output\Combine_Set1")
SummarizeResultsDistance(data="Analysis_output", output_folder=r"Analysis_output\Combine_Set1")

Note: These functions search all subfolders for the "results_total.xlsx" and "results_distance.xlsx" files. If you want to discard outliers, it can be practical to rename the corresponding files to, for example, "_results_total.xlsx" and "_results_distance.xlsx".

We receive a combined Excel sheet that contains the global analysis for all cells and another Excel sheet with the mean distance analysis. The columns Mean Angle and Orientation refer to the angular deviation between all orientation vectors and the direction to the respective cell center, and to the resulting orientation. These quantities are weighted by the coherency (orientation strength) and, additionally, by both the coherency and the image intensity. From the different cells we can now calculate further quantities, for example the mean orientation (weighted by intensity and coherency) of all cells, which is named Overall weighted Oriantation (mean all cells) and also stored in the same Excel file.

<img src="../master/docs/images/Fig3-combineexcel.png" width="800" />

Save Numpy Files

If you want to save the angle or coherency maps for other kinds of analysis, you can set the parameter SaveNumpy = True. Each output folder will then contain, among other files, the following npy files:

<img src="../master//docs/images/npy-files-1.png" width="800" />

FiberImageCrop.npy contains the original input fiber image resized to the size of the output analysis maps. These files have (depending on the argument edge) fewer pixels, because the image edges are excluded to avoid orientation analysis artifacts (default: 15 px). OrientationMap.npy contains the orientation towards the cell center (between -1 and 1) as explained above. CoherencyMap.npy can be used as a measure of anisotropy or orientation strength within the image (between 0 and 1), regardless of the specific direction of the orientation. This can be advantageous when the direction varies, such as in images containing multiple cells with stained stress fibers oriented in different directions. Quantifying the average coherency of these stress fibers in the cell-occupied area can still provide insight into the strength of the alignment.

<img src="../master//docs/images/npy-files-2.png" width="800" />

AngleDeviationMap.npy contains the deviation of the raw angles with respect to the cell center; these angles are used to calculate the orientation map. The raw angles with respect to the x-axis (but without reference to the cell center) are stored in AngleMap_NoReference.npy (between -90 and 90 degrees). They are calculated from the structure vector components in x- and y-direction stored in Vector_min_ax1.npy and Vector_min_ax0.npy. segmentation.npy contains the segmented cell mask, within which no analysis is performed in the normal procedure. To compute and store all these numpy fields even in places where the cell is segmented, you can use the argument ignore_cell_outline=True in the analysis pipeline. This will compute these maps within the entire image.
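As a small example of such a downstream analysis (assuming SaveNumpy = True was set and the output folder path is adapted to your data), the maps can simply be loaded with numpy:

import os
import numpy as np

out_folder = r"C:\user\Analysis_output\cell_1"      # example output folder of one cell

coherency = np.load(os.path.join(out_folder, "CoherencyMap.npy"))
orientation = np.load(os.path.join(out_folder, "OrientationMap.npy"))

# overall alignment strength and orientation towards the cell center;
# nan values (e.g. in the masked cell area) are ignored
print("mean coherency:  ", np.nanmean(coherency))
print("mean orientation:", np.nanmean(orientation))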

If you want to calculate the fiber coherency (CoherencyMap.npy) or the angle towards the x-axis (AngleMap_NoReference.npy, between -90 and 90 degrees) without an image of the cell, you can start the analysis by setting cell_list_string = None:

fiber_list, cell_list, out_list = generate_lists(fiber_list_string, None, output_main =output_folder)

This option will start the analysis without the second set of cell images and automatically compute all maps within the entire image using the ignore_cell_outline=True argument. The typical orientation map is then calculated with respect to the image center.

<img src="../master//docs/images/npy-files-3.png" width="800" />

Graphical User Interface (GUI)

For easy use of the CompactionAnalyzer, we provide a graphical user interface (GUI) that simplifies the execution and evaluation of several experiments. This graphical interface is now included within the saenopy package here (see Installation). Just select the "Orientation" tab. Pairs of fiber and cell images can be loaded individually or batchwise by using the * placeholder.

<img src="../master//docs/images//GUI_readimages.png" width="800" />

Parameters can be configured, and the cell segmentation can be viewed and changed individually per cell. Upon Run, the analysis is started and the results are stored in the specified output folder.

<img src="../master//docs/images//GUI_evaluation.png" width="800" />

For data analysis, the results_total.xlsx files can be loaded again individually or batchwise by using the * placeholder. Intensity and orientation can then be evaluated individually or by adding several cells to user-defined groups. Bar and distance plots (mean ± SE) are created automatically, and individual Python scripts to re-plot the data can be exported.

<img src="../master//docs/images//GUI_DataAnalysis.png" width="800" />