Automated Blastomere Segmentation for Visual Servo on Early-Stage Embryo
by
Simarjot Singh Sidhu
A thesis submitted in conformity with the requirements for the degree of Master of Applied Science
Department of Mechanical and Industrial Engineering University of Toronto
© Copyright by Simarjot Singh Sidhu 2019
Automated Blastomere Segmentation for Visual Servo on Early-Stage Embryo
Simarjot Singh Sidhu
Master of Applied Science
Department of Mechanical and Industrial Engineering
University of Toronto
2019
Abstract
Automation of single biological cell surgery requires the locations of organelles and cell structures to be determined, so that the automated processing steps of the cell surgery procedure can be carried out. In this work, z-stack images of mouse embryos are used as a model to develop image processing algorithms that determine the centroid position, in (x, y, z) coordinates, of embryo blastomeres. The transparency of embryos allows a series of images along the vertical cell axis (z) to be obtained. Individual z-stack images are processed using 2D image processing steps to first segment, and then estimate the centroid (x, y) coordinates of, the blastomeres in each 2D image. Successive processing of all z-stack images then permits the centroid of the blastomeres to be determined in (x, y, z) coordinates. Image processing-based calibration allows a PZD micropipette to move to the computed centroid position under PBVS control. These algorithms are experimentally verified with mouse embryos at the 2-blastomere stage of development.
Acknowledgments
The work described in this thesis would not have been possible without the help of several individuals.
Firstly, I would like to express my sincerest gratitude to my supervisor, Professor James K. Mills
for the continuous support of my research. His patience, support and immense knowledge were monumental in the completion of my work. Thank you for motivating me to push through in times
of failure.
I would like to thank Professor Goldie Nejat and Professor Pierre E. Sullivan for taking the time
to serve on my committee. I appreciated hearing your thoughts on my work.
Furthermore, this thesis would not have been made possible without the support of my labmates
at the Laboratory for Nonlinear Systems Control (NSCL), Ihab Abu-Ajamieh, Andrew Michalak,
Armin Eshaghi, William Yao and Maharshi Trivedi. It has been a privilege to work with such a
talented group. A special mention goes to Ihab for his great mentorship, and insightful career
advice and guidance. Thank you to all my friends for providing me encouragement and support throughout this journey.
Special thanks go to Dr. Christopher Yee Wong and Dr. Steven Kinio for their help in starting my
academic journey, and their guidance in career and academia.
Last but not least, I would like to express my deepest gratitude towards my parents for their
unconditional love, time, support, patience (and food), while supporting me on this endeavour. I
would not be the person I am without them.
Table of Contents
Acknowledgments.......................................................................................................................... iii
Table of Contents ........................................................................................................................... iv
List of Tables ................................................................................................................................. vi
List of Figures ............................................................................................................................... vii
List of Appendices ......................................................................................................................... xi
Nomenclature ................................................................................................................................ xii
Introduction .................................................................................................................................1
1.1 Preimplantation Genetic Diagnosis......................................................................................1
1.2 Automation of Single Cell Surgery......................................................................................2
1.3 Problem Statement and Objectives ......................................................................................3
1.3.1 Problem Statement ...................................................................................................3
1.3.2 Objectives ................................................................................................................4
1.4 Contributions........................................................................................................................5
1.5 Thesis Organization .............................................................................................................5
Background and Literature Review ............................................................................................7
2.1 Overview of Image Processing Techniques .........................................................................7
2.1.1 Types of Microscopy ...............................................................................................7
2.1.2 Depth of Field ..........................................................................................................9
2.1.3 Image Processing and Cell Segmentation ..............................................................12
2.2 Overview of Visual Servoing Techniques .........................................................................15
2.2.1 Introduction to Visual Servoing .............................................................................15
2.2.2 Image-Based Visual Servo .....................................................................................16
2.2.3 Position-Based Visual Servo..................................................................................16
Methodology .............................................................................................................................19
3.1 z-Stack Images ...................................................................................................................19
3.1.1 Obtaining z-Stack Images ......................................................................................21
3.2 Blastomere Segmentation ..................................................................................................22
3.2.1 Initialization (Step 1) .............................................................................................22
3.2.2 Low-Cost Energy Path (Step 2) .............................................................................29
3.2.3 Blastomere Centroid Calculation (Step 3) .............................................................41
3.3 Visual Servoing ..................................................................................................................44
3.3.1 Micropipette Calibration ........................................................................................45
3.3.2 Position Based Visual Servoing .............................................................................49
Results and Discussion ..............................................................................................................53
4.1 Experimental Procedure .....................................................................................................53
4.2 Experimental Results .........................................................................................................56
4.3 Discussion of Results .........................................................................................................59
Conclusions ...............................................................................................................................62
5.1 Summary and Conclusions ................................................................................................62
5.2 Contributions......................................................................................................................64
5.3 Recommendations and Future Works ................................................................................64
References ......................................................................................................................................65
Appendices .....................................................................................................................................71
List of Tables
Table 1: Sample Blastomere Coordinate Data .............................................................................. 72
Table 2: Sample Blastomere Coordinate Calculations ................................................................. 73
Table 3: Sample Blastomere Coordinate Calculation Errors ........................................................ 73
Table 4: Sample Given Micropipette Tip Coordinates ................................................................. 74
Table 5: Sample True Micropipette Tip Coordinates ................................................................... 74
Table 6: Sample Micropipette Tip Coordinate Errors .................................................................. 74
List of Figures
Figure 1.1: Development of Early-Stage Embryo [5]. .................................................................... 2
Figure 1.2: Preexisting Experimental Setup: (a) Nikon Ti-U brightfield inverted microscope and
(b) Scientifica Patchstar robotic micromanipulators and Prior Proscan III motorized stage. ......... 4
Figure 2.1: Images captured by brightfield microscopy and fluorescence, respectively.
Blastomeres are dyed red, and nuclei are dyed green [17]. ............................................................ 8
Figure 2.2: Various brightfield microscopy techniques: (a) 12-cell stage embryo captured with DIC
[28], (b) zygote stage embryo captured with HMC [32], (c) 4-cell stage embryo captured with
HMC [33]. ....................................................................................................................................... 9
Figure 2.3: Images showing tetrahedral shape of 4-cell stage embryo: (a) Image focused at bottom
two blastomeres of embryo [38]. (b) Image focused at top two blastomeres of embryo [38]. ..... 10
Figure 2.4: Diagram of 2-cell stage embryo with blastomeres, and the centroid of the blastomere, C_T.
....................................................................................................................................................... 11
Figure 2.5: Diagram of embryo placed on motorized stage of a microscope. The movement axes
of both the stage and objective lens are labelled. .......................................................................... 11
Figure 2.6: Active contour segmentation of zona pellucida: (a) Original image [40]. (b) Active
contour segmentation [40]. (c) Manual segmentation as reference [40]. ..................................... 13
Figure 2.7: Variational Level Sets for Cell Segmentation: (a) manual segmentation [41]. (b)
Blastomeres within the ZP, with bounding curves [41]. ............................................................... 13
Figure 2.8: Z-stack images of embryo: Top row: original Z-stack images obtained [33]. Middle
row: blastomeres segmented with graph-based method [33]. Segmented contours marked in
yellow. Bottom row: reconstructed 3D structure of blastomeres [33]. ......................................... 14
Figure 2.9: Demonstration of IBVS: (a) Initial coordinates of features, marked as yellow dots, and
(b) Desired coordinates of features, marked as red dots. .............................................................. 17
Figure 2.10: Controlling position coordinates to move to desired location with PBVS............... 17
Figure 3.1: Diagram showing the axes of motion and the Cartesian reference frame. The motorized
stage moves along the xy-plane. The objective lens moves along the z-axis. .............................. 19
Figure 3.2: Diagram of Image Stack and z-Stack Images. The z-stack image outlined in red is the
z-stack image of interest (IOI). The two z-stack images outlined in blue are involved in the process
to create the image array, J, which is further explained in Section 3.2.2 below. .......... 20
Figure 3.3: Z-stack images of embryo. (a) Embryo with z-stack images taken successively at equally separated focal planes. (b) Individual z-stack images stacked to indicate which part of the blastomere is captured in each z-stack image. Also shows the format of the TIF files. ....................... 21
Figure 3.4: The original image of the embryo, selected from the middle of the image stack [15].
....................................................................................................................................................... 23
Figure 3.5: Image with standard deviation filter applied. ............................................................. 24
Figure 3.6: The thresholded binarized image. .............................................................................. 25
Figure 3.7: Image with area filter applied. .................................................................................... 25
Figure 3.8: Image with area fill applied. ....................................................................................... 26
Figure 3.9: Image smoothed by a structuring element, resulting in a blob containing the two blastomeres. ..................................................................
Figure 3.10: Image processing algorithms to acquire approximate centroids. (a) Blob acquired from the previous step (Figure 3.9). (b) Calculated centroid of the blob, represented as a blue *. (c) Line from the blob centroid to the closest edge. (d) Segmented blastomeres from the line cut. (e) Centroids of the respective blastomeres, represented as blue *. .............................
Figure 3.11: z-Stack image with ROI around BOI. ...................................................................... 29
Figure 3.12: ROI displayed in polar coordinates at the z-stack IOI, J_I. ...................... 30
Figure 3.13: Format of image array, J. ......................................................................... 31
Figure 3.14: Energy array at the z-stack image of interest, E_I. ................................... 32
Figure 3.15: Basic graph structure example. ................................................................................ 33
Figure 3.16: Basic graph structure path example.......................................................................... 34
Figure 3.17: Sample of 2D graph structure of the energy z-stack image, E_I. .............. 35
Figure 3.18: Sample of 3D graph structure, E(θ_s, ρ_s, l). .......................................... 36
Figure 3.19: Components of 3D Graph Structure Complexity. .................................................... 37
Figure 3.20: Graph showing number of permutations, P_l, vs. number of z-stack images within the graph for the energy matrix, E. ............................................................................... 38
Figure 3.21: Sparse matrix where l = 1 or l = 3. ..................................................... 39
Figure 3.22: Low-cost energy path. (a) Path ω_I at E_{I-1}. (b) Path ω_I at E_I. (c) Path ω_I at E_{I+1}. (d) Path ω_I projected onto the xy-plane, γ_I. ..................................................................
Figure 3.23: Computed path, γ_I, represented by the red line, and centroid, C_I, represented by a red *, of the z-stack IOI of I_I. .............................................................................................
Figure 3.24: Diagram of the z-stack image centroids, C_n, areas A_n, and the computed blastomere centroid, Ĉ. ....................................................................................................................
Figure 3.25: Flowchart of Blastomere Segmentation Algorithm. The orange section represents the
manual operations required to begin the automated task, whereas the blue sections represent the
automated tasks. Statements in green represent the output of each respective step. ...................... 43
Figure 3.26: Schematic of the experimental setup. ....................................................................... 45
Figure 3.27: Micropipette image segmentation. (a) Original image of micropipette. (b) Canny edge
detection. (c) Image Fill. (d) Micropipette outline split into side walls and tip. (e) Micropipette
with orientation and tip position. .................................................................................................. 47
Figure 3.28: Micropipette Calibration Procedure. (a) Micropipette at first position. (b)
Micropipette at second position. (c) Micropipette at calibration test position. ............................ 49
Figure 3.29: Micropipette Control Path. (a) Micropipette at second position, P_VS,2. (b) Micropipette at third position, P_VS,3. (c) Micropipette at fourth, and final, position, P_VS,4. .. 51
Figure 4.1: Flowchart of overall BOI centroid computation and visual servo process. The boxes
represent tasks, whereas the arrows represent the procession from one task to another. Orange
boxes and arrows represent tasks performed manually, whereas the blue boxes and arrows represent automatically performed tasks. ..................................................... 55
Figure 4.2: Sample Experiment of Visual Servoing. .................................................................... 58
Figure 4.3: Various z-stack images of embryo. (a) z-stack image at I_14. Note the white, circularly shaped outline within the embryo; this is the boundary of the blastomere at this z-stack image. (b) z-stack image at I_31, also used as the middle of the image stack because it is the z-stack image with the largest blastomere boundary. (c) z-stack image at I_44. The blastomere boundary is not visible due to blastomere opacity. ................................................................... 60
Figure 4.4: Comparison of Micropipette Tips for Calibration. ..................................................... 61
List of Appendices
Appendix A. Experiment of Visual Servoing
Appendix B. Sample Blastomere Coordinate Calculations
Appendix C. Sample Micropipette Tip Coordinate Calculations
Appendix D. Sample of Blastomere Segmentation Across Image Stack
Nomenclature
Abbreviations
3D Three Dimensions/Dimensional
ART Assisted Reproductive Technologies
BOI Blastomere of Interest
CAD Canadian Dollar
DIC Differential Interference Contrast
DoF Depth of Field (Depth of Focus)
DOF Degrees of Freedom
GPS Global Positioning System
HMC Hoffman Modulation Contrast
IBVS Image-Based Visual Servo Control
ICSI Intracytoplasmic Sperm Injection
IOI z-Stack Image of Interest
IVF In Vitro Fertilization
NA Numerical Aperture
OQM Optical Quadrature Microscopy
PBVS Position-Based Visual Servo Control
PGD Preimplantation Genetic Diagnosis
PZD Partial Zona Dissection
ROI Region of Interest
TIF Tagged Image File Format
USD United States Dollar
ZP Zona Pellucida
Microscopy
a Sample Node
A_n n-th Index of Area of γ_n
b Sample Node
c Sample Node
C_A Centroid Approximation used for ROI Initialization
c_Ax x-Coordinate of Centroid Approximation used for ROI Initialization
c_Ay y-Coordinate of Centroid Approximation used for ROI Initialization
C_blob Centroid of Blob
C_n Centroid of γ_n
c_nx x-Coordinate of Centroid of γ_n
c_ny y-Coordinate of Centroid of γ_n
C_M Calculated Centroid by Manual Segmentation
c_Mx x-Coordinate of Calculated Centroid by Manual Segmentation
c_My y-Coordinate of Calculated Centroid by Manual Segmentation
c_Mz z-Coordinate of Calculated Centroid by Manual Segmentation
C_T True Centroid of BOI
Ĉ Calculated Centroid of BOI
d Sample Node
d_I Distance between Consecutive z-Stack Images
e Sample Node
E Energy Array
E_b Energy Value at Node b
E_c Energy Value at Node c
E_d Energy Value at Node d
E_i i-th Index of Energy Array
E_j j-th Index of Energy Array
E_n n-th Index of Energy Array
e_x Error along x-Axis
e_y Error along y-Axis
e_z Error along z-Axis
f Sample Node
F_thresh Threshold of Binarized Filter
g Sample Node
G_r Gradient Operator along Radial Direction
i Indexing Variable
I_n n-th Index z-Stack Image of an Image Stack
I_I Index of z-Stack Image of an Image Stack where BOI is Largest
j Indexing Variable
J Image Array
J_n n-th Index of Image Array
k Indexing Variable
K Scaling Factor for Low-Cost Energy Path Formula
l Number of z-Stack Images used for Graph Structure
M_v Total Visual Magnification of Microscope
n Refractive Index of Medium
N Number of z-Stack Images in an Image Stack
ℕ+ Positive Natural Numbers
NA Numerical Aperture of Objective Lens
P Sigmoid Function
P_l Permutations of Graph Structure
x_T True x-Coordinate of BOI
y_T True y-Coordinate of BOI
z_T True z-Coordinate of BOI
x̂ x-Coordinate of Calculated Centroid of BOI
ŷ y-Coordinate of Calculated Centroid of BOI
z_n z-Coordinate at z-Stack Image I_n
ẑ z-Coordinate of Calculated Centroid of BOI
α Direction of Lighting from HMC Imaging
∈ Belongs to (Mathematical Operator)
ω_n n-th Index of 3D Path for Blastomere Segmentation
γ_n n-th Index of 2D Projection on xy-Plane of ω_n
λ Wavelength of Light Used
ρ Radius of Ring for ROI
ρ′ Radius of Inner Ring of ROI
ρ″ Radius of Outer Ring of ROI
ρ_s Number of ρ Samples for J
θ Angle for use in Polar Coordinates of J
θ_s Number of θ Samples for J
Visual Servoing
a Set of Parameters representing Additional Knowledge about the System
C_L Centroid of Left Micropipette Side Wall
C_R Centroid of Right Micropipette Side Wall
Ĉ Calculated Centroid of BOI
e Error between Features
i Indexing Variable
j Indexing Variable
k Indexing Variable
m Set of Image Measurements
n Total Number of Points of S_outline
P Micropipette Tip Position
P_offset Offset of P from Micromanipulator Frame to Camera Frame
P^C Micropipette Tip Position for Calibration in Camera Frame
P^M Micropipette Tip Position for Calibration in Micromanipulator Frame
P_1 First Micropipette Tip Position for Calibration
P_1^C First Micropipette Tip Position in Camera Frame
P_1,x^C x-Component of P_1^C
P_1,y^C y-Component of P_1^C
P_1^M First Micropipette Tip Position in Micromanipulator Frame
P_1,x^M x-Component of P_1^M
P_1,y^M y-Component of P_1^M
P_2 Second Micropipette Tip Position for Calibration
P_2^C Second Micropipette Tip Position in Camera Frame
P_2,x^C x-Component of P_2^C
P_2,y^C y-Component of P_2^C
P_2^M Second Micropipette Tip Position in Micromanipulator Frame
P_2,x^M x-Component of P_2^M
P_2,y^M y-Component of P_2^M
P_VS Position of Micropipette Tip for Visual Servoing
P_VS,1 First Position of Micropipette Tip for Visual Servoing
P_VS,2 Second Position of Micropipette Tip for Visual Servoing
P_VS,3 Third Position of Micropipette Tip for Visual Servoing
P_VS,4 Fourth Position of Micropipette Tip for Visual Servoing
R Rotation Matrix
s Scaling Factor
𝐬 Current Set of Features
𝐬* Desired Set of Features
S_L Set of Points along Left Micropipette Side Wall
S_R Set of Points along Right Micropipette Side Wall
S_outline Set of Points along Perimeter of Micropipette
S_outline,x x-Component of S_outline
S_outline,y y-Component of S_outline
t Time
T Transformation Matrix
x x-Coordinate of Micropipette Tip
x_1 x-Coordinate of First Micropipette Tip Position for Visual Servoing
x_2 x-Coordinate of Second Micropipette Tip Position for Visual Servoing
x_3 x-Coordinate of Third Micropipette Tip Position for Visual Servoing
x_4 x-Coordinate of Fourth Micropipette Tip Position for Visual Servoing
x_E Error of Tip along x-Direction
x_G Sample Given x-Coordinate
x_T x-Coordinate of True Micropipette Tip Position
y_E Error of Tip along y-Direction
y_G Sample Given y-Coordinate
y_T y-Coordinate of True Micropipette Tip Position
z_E Error of Tip along z-Direction
z_G Sample Given z-Coordinate
z_T z-Coordinate of True Micropipette Tip Position
x̂ x-Coordinate of Calculated Centroid of BOI
y y-Coordinate of Micropipette Tip
y_1 y-Coordinate of First Micropipette Tip Position for Visual Servoing
y_2 y-Coordinate of Second Micropipette Tip Position for Visual Servoing
y_3 y-Coordinate of Third Micropipette Tip Position for Visual Servoing
y_4 y-Coordinate of Fourth Micropipette Tip Position for Visual Servoing
ŷ y-Coordinate of Calculated Centroid of BOI
z z-Coordinate of Micropipette Tip
z_1 z-Coordinate of First Micropipette Tip Position for Visual Servoing
z_2 z-Coordinate of Second Micropipette Tip Position for Visual Servoing
z_3 z-Coordinate of Third Micropipette Tip Position for Visual Servoing
z_4 z-Coordinate of Fourth Micropipette Tip Position for Visual Servoing
z_offset Offset of z from Micromanipulator Frame to Camera Frame
z^C z-Coordinate of Micropipette Tip in Camera Frame
z^M z-Coordinate of Micropipette Tip in Micromanipulator Frame
ẑ z-Coordinate of Calculated Centroid of BOI
α Angle of Micropipette
α_init Initial Estimate of Micropipette Angle
α_init,L Micropipette Left Wall Angle for Initialization
α_init,R Micropipette Right Wall Angle for Initialization
β^C Angle for Rotation Matrix in Camera Frame
β^M Angle for Rotation Matrix in Micromanipulator Frame
δ Sampling Length
η Number of Points used for Angle Estimation
∈ Belongs to (Mathematical Operator)
Introduction
1.1 Preimplantation Genetic Diagnosis
In the healthcare industry, technology is progressing at a rapid rate. Advancements are being made
to further develop technologies towards the micro and cellular scale. These developments are
necessary for the means of manipulation of individual cells and their intracellular components.
Specifically, the field of assisted reproductive technologies (ART), requires the use of these
technologies to help with fertility and reproduction related issues. One such ART procedure, in vitro fertilization (IVF), requires manipulation of embryos at an early stage of development, within a few days of fertilization.
In vitro fertilization involves removing an unfertilized cell, known as an oocyte, from the organism for procedures such as intracytoplasmic sperm injection (ICSI) [1] and preimplantation genetic diagnosis (PGD) [2], as opposed to in vivo fertilization, in which the oocyte remains within the organism [3]. A typical IVF procedure is as follows. An oocyte is first removed from the organism. ICSI involves using a small sharp needle, also called a micropipette, to inseminate the oocyte. Once inseminated, the fertilized cell, known as an embryo (or zygote), is stored in an incubator that mimics the temperature and CO2 levels inside the organism, and begins developing. Initially, the embryo starts as a single-celled zygote. It then advances to a new stage every day: on the first day the intracellular material divides into the 2-cell stage, and in the subsequent days into the 4-cell and then 8-cell stages [4]. These divided cells are known as blastomeres, and are vital to the PGD process [2]. The embryo then develops into the 16-32 cell stage (morula), and then into a blastocyst. These stages can be seen in Figure 1.1 [5]. Only then is it transferred back into the organism for further natural development.
Figure 1.1: Development of Early-Stage Embryo [5].
Preimplantation genetic diagnosis is a method used by embryologists during IVF treatments for genetic testing purposes [2]. One or two blastomeres at the 2-cell, 4-cell, or 8-cell stage are extracted for genetic analysis. These analyses may be used to diagnose genetic diseases, including autosomal-dominant disorders such as Huntington disease and Marfan syndrome, and autosomal-recessive disorders such as cystic fibrosis and sickle cell disease [2], [6]. Manual PGD processes performed by embryologists have a low rate of success, hovering at around 30% [7]. The average cost of an IVF treatment ranges from approximately $10,000 to $20,000 (CAD) per IVF cycle in Canada [8]. The cost per success for cycle-based IVF treatment nears $50,000 (USD) in the United States [9]. There is a need to both lower the cost for IVF patients and vastly improve the success rate of this process.
1.2 Automation of Single Cell Surgery
Automating single biological cell surgery is an effective approach to resolving this problem. Automation of cell surgery tasks has the potential to provide a robust and repeatable procedure, allowing for higher success rates of IVF treatments. Automated processes operate without the drawback of operator fatigue experienced by embryologists. They also reduce human contamination of the embryo, and reduce the time the embryo spends outside of the host. Automation in this field has the advantage of increased throughput and speed in performing cell surgery tasks, and is designed with the primary goal of increased success rates. In recent years, several advancements have been made to automate these IVF and PGD tasks, such as embryo rotation [10], [11], micropipette control [10], [12] and cell aspiration [13]. Hence, automation can be less expensive and foster greater use, providing an alternative to the standard procedures now used in IVF.
1.3 Problem Statement and Objectives
1.3.1 Problem Statement
An important step in automating single cell surgery is determining where individual biological cells, and their intracellular components, are located in 3D Cartesian space. In particular, blastomeres within early-stage embryos are in general not located in the same focal plane as one another, which poses a problem for automated detection of blastomeres during tasks such as blastomere aspiration for PGD processes [14]. Microscopes also possess a shallow depth of focus, so only a relatively thin layer of the blastomere is in focus at any instant in time. In some cases of automated blastomere extraction, due to the limited depth of focus, the entire blastomere travels along the z-direction away from the focal plane, and the given task fails [15]. Knowledge of 3D coordinate location data is therefore vital to successfully complete automated single-cell surgery tasks. This coordinate data is important for automating and operating image-based processes, such as visual servo control, particularly position-based visual servo (PBVS) control, so that blastomere-related tasks, such as aspiration, may be carried out successfully.
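Although the PBVS controller itself is developed in Chapter 3, the idea stated here, regulating the 3D position error between the micropipette tip and a target centroid toward zero, can be illustrated with a minimal sketch. All names, units, and the gain value below are illustrative assumptions, not the controller implemented in this work:

```python
import numpy as np

def pbvs_step(tip_xyz, target_xyz, gain=0.5):
    """One PBVS iteration: compute the Cartesian error between the
    measured micropipette tip position and the target (e.g. a computed
    blastomere centroid), and return a proportional displacement command.
    Names and gain are hypothetical, for illustration only."""
    error = np.asarray(target_xyz, dtype=float) - np.asarray(tip_xyz, dtype=float)
    return gain * error

# Simulated servo loop with an idealized stage that applies each command exactly.
tip = np.array([0.0, 0.0, 0.0])
target = np.array([120.0, 80.0, 35.0])   # hypothetical centroid position (um)
for _ in range(20):
    tip = tip + pbvs_step(tip, target)
print(np.linalg.norm(target - tip))      # residual error after 20 iterations
```

For this idealized stage, a pure proportional law shrinks the error by a factor of (1 - gain) per iteration, so the tip converges geometrically to the target.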
1.3.2 Objectives
For the purpose of this thesis, the developed algorithms must be able to compute
the centroid coordinates of a blastomere of interest (BOI) within an embryo in 3D Cartesian space,
and then move a micropipette to the computed position using PBVS control, for blastomere
aspiration and extraction purposes. Furthermore, the proposed algorithms must integrate with the
preexisting Nikon Ti-U brightfield inverted microscope setup [Figure 1.2(a)], equipped with two
robotic micromanipulators and a motorized stage [Figure 1.2(b)].
Figure 1.2: Preexisting Experimental Setup: (a) Nikon Ti-U brightfield inverted microscope and
(b) Scientifica Patchstar robotic micromanipulators and Prior Proscan III motorized stage.
1.4 Contributions
In this work, the following contributions are made:
1. The research proposes a method to obtain image data along the z-direction of an embryo,
known as z-stack images.
2. From the obtained z-stack images, the research proposes an automated image processing
procedure to determine the centroid of a BOI, and a visual servo procedure to move a
micropipette to this computed centroid position.
3. The proposed research integrates with the existing Nikon Ti-U brightfield microscope
setup, equipped with two robotic micromanipulators and a motorized stage.
1.5 Thesis Organization
The remainder of the thesis is divided into four chapters. Chapter 2 presents background and a
literature review for the proposed research. This includes background on microscopy, depth of
field, and image processing and cell segmentation techniques, as outlined in Section 2.1. The
literature review then introduces visual servo techniques and compares the two main types,
image-based and position-based visual servo control, in Section 2.2.
The methodology proposed in this research is detailed in Chapter 3. This chapter includes the
image acquisition from a brightfield microscope and introduces the concept of z-stack images and
the image stack, as outlined in Section 3.1. With the acquired image stack, Section 3.2 details the
proposed 3D image processing algorithms for computing the centroid of the BOI. The image
processing procedures in that section involve algorithms to determine a region of interest in each
z-stack image for subsequent steps, and the construction of a graph structure that produces a
low-cost energy path for segmentation of the BOI in each z-stack image. With the blastomeres
segmented in every z-stack image of the image stack, the 3D Cartesian coordinates of the BOI
centroid are calculated. Section 3.3 details the visual servo procedure used to move a micropipette
to the target position, the computed centroid of the BOI. Starting with an image processing-based
micropipette calibration, a method is then described to move the micropipette to the computed BOI
centroid position.
Chapter 4 presents a guide to acquiring results from the proposed algorithms, from a user's
perspective. The experimental procedure used to acquire results is detailed in Section 4.1. Data
from sample experiments are presented, and the proposed algorithms are validated for accuracy,
in Section 4.2. The results are then discussed in Section 4.3, along with limitations of the
proposed algorithms.
Lastly, Chapter 5 concludes by summarizing the thesis and reviewing the contributions of the
research, along with recommendations for future work.
Background and Literature Review
In this chapter, a literature review of the research is presented. The review is separated into two
main parts. Section 2.1 provides an overview of image processing techniques, from the various
types and limitations of microscopy, such as limited depth of field, to the methods used for cell
segmentation. Section 2.2 provides an overview of visual servoing and its two types, image-based
and position-based visual servo control, and motivates their use in the automation of single cell
surgery.
2.1 Overview of Image Processing Techniques
2.1.1 Types of Microscopy
To automate the task of calculating the 3D coordinates of blastomeres within embryos, image
processing is necessary, especially when a brightfield microscope is used to acquire images.
Brightfield microscopes are attractive due to their simple setup and their suitability for observing
living cells. Their limitation is low contrast when observing biological cells, particularly
translucent cells such as embryos [16]. Image contrast can be enhanced either physically, using
fluorescence techniques, or in software, using image processing techniques, or by some
combination of the two. Tsichlaki and FitzHarris dyed embryos to measure the volume of the
blastomere nuclei while simultaneously dyeing the blastomeres themselves [17]. Figure 2.1 shows
images of embryos at various stages of development, captured both with a brightfield microscope
and with a confocal microscope under fluorescent imaging. Note the stark difference in contrast
between the two techniques.
Confocal microscopes are capable of segmenting intracellular components, such as blastomeres,
in 3D space, provided the subjects are dyed [16], [18]. Optical coherence tomography methods
are also useful for rapidly acquiring 3D models of embryos [19]. However, segmentation with
confocal microscopes requires raster scanning, a type of sequential scanning, which may take
long periods of time [15], [20]. Fluorescence imaging in confocal microscopy also exposes cells
to risks from toxic dyes and photobleaching. Research is ongoing to minimize the damage to
cells from fluorescent dyeing [21]–[25].
Figure 2.1: Images captured by brightfield microscopy and fluorescence, respectively.
Blastomeres are dyed red, and nuclei are dyed green [17].
There are methods to improve the contrast of unstained biological samples captured by brightfield
microscopes to better be able to detect translucent bodies, such as the blastomeres and zona
pellucida (ZP) of embryos. For example, differential interference contrast (DIC) is based on the
principle of wave interference, similar in principle to the equipment used to detect gravitational
waves at LIGO [26], [27]. DIC uses wave interference to accentuate the outlines of intracellular
components, such as blastomeres, generating higher-contrast images. Newmark et al. counted the
number of blastomeres within an embryo using DIC in conjunction with optical quadrature
microscopy (OQM) [28]. Figure 2.2(a) shows an image captured with a brightfield microscope
using the DIC technique. Soll et al. successfully tracked and analyzed the motility of organelles
in 3D with DIC [29]. Similar to DIC, Hoffman modulation contrast (HMC) also provides a method
to accentuate the outer edges of cells, and is often used for this purpose [30], [31]. Giusti et al.
captured images of a zygote-stage embryo using HMC, as shown in Figure 2.2(b) [32], and
accurately segmented the zygote, recovering the cell contour of the zygote boundary, using a
graph-based method enabled by the high contrast of the captured HMC image. Giusti et al. also
segmented 4-cell stage embryos using the same graph-based method, with the addition of image
stacks (also known as focus stacks, or z-stack images) [33]. With this method, they segmented
the 4 blastomeres with a success rate of 71.3%.
Figure 2.2: Various brightfield microscopy techniques: (a) 12-cell stage embryo captured with DIC
[28], (b) zygote stage embryo captured with HMC [32], (c) 4-cell stage embryo captured with
HMC [33].
2.1.2 Depth of Field
There exists another problem when imaging cells, due to the optical properties of microscopes,
specifically their shallow depth of focus. Embryos have a diameter of approximately 100 μm, and
blastomeres have a diameter ranging from approximately 25 to 75 μm [34], depending on the
embryo's stage of development. At these microscopic scales, the objective lenses of the microscope
possess a very shallow depth of field (DoF) (also commonly known as depth of focus). The DoF
depends on the objective lens's physical properties, namely its magnification, numerical aperture
(NA), and the wavelength of light used [35]. Berek's formula gives the DoF of an objective lens,
as shown in (2.1) [35].

DoF = n(λ/(2·NA²) + 340/(M_t·NA))    (2.1)

where M_t is the total magnification, λ is the wavelength of light used, and n is the refractive
index of the medium in which the object is situated. Depending on the objective lens, the DoF
may vary from 5-20 μm at these scales; the entire blastomere, or even the embryo, will not be in
focus in a single image. Early-stage embryos at the 2-cell stage tend to have their blastomeres
oriented such that they lie on a plane parallel to the stage on which the embryo sits [36]. Embryos
at the 4-cell stage are oriented in a tetrahedral pattern the majority of the time, at >80% [36], [37].
When blastomeres are arranged in this tetrahedral pattern, they do not lie in the same plane, making
it difficult to locate them in 3D Cartesian space, as shown in Figure 2.3 [38]. The very limited
depth of focus thus poses a challenge in locating the blastomeres in 3D Cartesian space.
Figure 2.3: Images showing tetrahedral shape of 4-cell stage embryo: (a) Image focused at bottom
two blastomeres of embryo [38]. (b) Image focused at top two blastomeres of embryo [38].
The position coordinates of the blastomere of interest (BOI) are 3D Cartesian coordinates, i.e. x,
y, and z coordinates. The centroid of the BOI is C_B = (x_B, y_B, z_B), where C_B is the true
centroid position of the BOI, and x_B, y_B, and z_B are the true centroid coordinate components
along the x, y, and z axes respectively. A diagram of the embryo with C_B labelled is shown in
Figure 2.4. This centroid provides an accurate measure of the blastomere position, necessary for
blastomere aspiration or extraction purposes. Locating this centroid determines where the
blastomere lies in 3D Cartesian space. Most, if not all, IVF experimental setups are arranged such
that the microscope camera observes the embryo in the xy-plane. Since brightfield microscopes
capture 2D images, image processing algorithms are required to obtain the x_B and y_B
coordinates. The camera must move along the z-axis in order to obtain the z_B coordinate. A
diagram of this setup is shown in Figure 2.5.
Figure 2.4: Diagram of 2-cell stage embryo with blastomeres, and centroid of the blastomere, C_B.
Figure 2.5: Diagram of embryo placed on motorized stage of a microscope. The movement axes
of both the stage and objective lens are labelled.
2.1.3 Image Processing and Cell Segmentation
There are various methods to obtain the desired z-coordinate of a blastomere observed under a
brightfield microscope, z_B. Ideally, z_B is located where the blastomere is widest, and the
blastomere boundary is clearest when the focal plane of the microscope lies at this part of the
embryo. Assuming a spherical shape, this is also the center of the blastomere. One method to
obtain z_B is software-based auto-focusing. Bahadur and Mills maintained focus at a targeted
position within embryos using an autofocusing technique based on bare-bones particle swarm
optimization and Gaussian jumps [38]. The technique involves obtaining sharpness values, in
terms of standard deviation, at several points along the z-axis, and finding the z-coordinate that
gives the largest sharpness value. A similar auto-focusing approach was used by Wang et al. to
focus on and detect the polar bodies of oocytes [39].
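The sharpness-maximization idea behind such autofocusing can be sketched as follows. This is a minimal illustration using the standard deviation of pixel intensities as the focus measure; the function names are illustrative, not taken from [38] or [39]:

```python
import numpy as np

def sharpness(image):
    """Focus measure: standard deviation of pixel intensities.
    Sharper (in-focus) slices have higher local contrast."""
    return float(np.std(image))

def best_focus_index(z_stack):
    """Return the index of the sharpest slice in a list of 2D arrays."""
    scores = [sharpness(img) for img in z_stack]
    return int(np.argmax(scores))
```

In practice the search over z is driven by an optimizer (e.g. particle swarm) rather than an exhaustive sweep, but the focus measure is the same idea.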
As mentioned in Section 2.1.2, image processing techniques are required to obtain the x_B and
y_B coordinates of the BOI. Image segmentation creates a mask with which an object of interest
can be selected for further analysis. Segmentation may be performed through basic image
processing techniques, such as binary thresholds, standard deviation and Gaussian filters, area
and perimeter filters, and image smoothing. However, due to the complexity of the embryo and
blastomere structure, such as overlapping images of blastomeres and a large number of image
artifacts, more advanced image processing techniques are better suited for segmentation. One
such method is active contours, used by Morales et al. to segment the zona pellucida of embryos,
as shown in Figure 2.6 [40]. However, this requires manual initialization of parameters, such as
defining the foreground and background of a specimen, which defeats the purpose of automating
the determination of the blastomere centroid coordinates. Level sets also provide a way to
determine x_B and y_B, as obtained from the manually segmented 2D contours by Pedersen et
al., as seen in Figure 2.7 [41], [42]. Pedersen et al. was also able to approximate a model of the
embryo with the variational level set approach [41], [42]. Giusti et al. employed a graph-based
segmentation method to segment zygotes [33]. After initializing a region of interest (ROI), a
low-cost, gradient-based graph algorithm is run to calculate the outer edge of the zygote. This
approach may also be used to find the BOI. Giusti et al. also acquired and analyzed z-stack images
to obtain 3D morphology measurements of early-stage embryos, as shown in Figure 2.8 [33].
Other advanced segmentation techniques involve Canny and Sobel edge detection algorithms, the
watershed and Otsu methods, and machine learning with neural networks [43].
Figure 2.6: Active contour segmentation of zona pellucida: (a) Original image [40]. (b) Active
contour segmentation [40]. (c) Manual segmentation as reference [40].
Figure 2.7: Variational Level Sets for Cell Segmentation: (a) manual segmentation [41]. (b)
Blastomeres within the ZP, with bounding curves [41].
Figure 2.8: Z-stack images of embryo: Top row: original Z-stack images obtained [33]. Middle
row: blastomeres segmented with graph-based method [33]. Segmented contours marked in
yellow. Bottom row: reconstructed 3D structure of blastomeres [33].
2.2 Overview of Visual Servoing Techniques
2.2.1 Introduction to Visual Servoing
Physical manipulation is a necessary step in performing automated surgical procedures on
embryos. Methods for the manipulation and surgery of cells and intracellular components include
optical tweezers [44], [45], electric fields [46], and friction-based rotation [11], for both
translation and rotation of the cell. Since the objective here is to use the existing microscope
hardware commonly found in IVF clinics, micropipettes are used as the tool for the automated
cell surgery task. For robotic micromanipulators to perform automated processes on embryos, the
position of the embryos with respect to the micromanipulator must be determined for controller
development [47]. The closed-loop control of a manipulator from visual images is known as
visual servoing [47].
As defined by Chaumette and Hutchinson, "visual servo control refers to the use of computer
vision data to control the motion of a robot" [48]. It requires a camera to acquire an image, from
which the coordinates of objects are calculated, permitting the motion of a robot. The camera may
either be mounted on the end-effector, or some other appendage, of the moving robot, or it may
remain stationary [48], [49]. In the case of automating IVF tasks with brightfield microscopes,
the focal plane of the camera moves only along the z-axis, whereas the micropipette moves along
the x, y, and z axes.

The aim of visual servoing is to minimize the error between the current set of features and the
desired set of features, as shown in (2.2) [48].

e(t) = s(m(t), a) − s*    (2.2)

where e(t) is the feature error, s is the current set of features, m(t) is the set of image
measurements, a is a set of parameters representing additional knowledge about the system, and
s* is the desired set of features. There are two main visual servo control techniques: image-based
visual servo control (IBVS) and position-based visual servo control (PBVS). Both variations have
their pros and cons.
2.2.2 Image-Based Visual Servo
IBVS operates on pixel coordinates, driving them toward desired values [48]. The goal of IBVS
is to solve (2.2) where s is the pixel coordinates of each feature of the object of interest from the
camera's perspective, and s* is the desired pixel coordinates of each feature. IBVS moves the
camera, and/or the object, such that the pixel coordinates of the features, s, align with those in the
desired state, s*. IBVS is useful when using cameras to control robotic manipulators, particularly
at larger scales. Liu and Sun performed IBVS on cells for tracking and cellular rotation [50].
However, it was performed only in the xy-plane and did not include the z-axis. At the cellular
scale, due to the shallow DoF of the microscope, it becomes increasingly difficult to implement
IBVS since many more parameters are at play, and it is thus not ideal for this research.
2.2.3 Position-Based Visual Servo
PBVS is the second type of visual servoing. Instead of pixel coordinates, PBVS operates on the
position coordinates of objects and manipulators to achieve the same goal as IBVS, solving (2.2),
with one difference: for PBVS, s is the position coordinates of each feature of the object of
interest from the robot's perspective, and s* is the desired position coordinates of each feature
[48]. The concept is illustrated in the diagrams of Figure 2.9, which show (a) the initial state of a
square-shaped object with reference to the robot frame, and (b) the desired state. The goal of
PBVS is to move the camera or object such that the position coordinates of the features at the
initial position, shown as the yellow dots in Figure 2.9(a), move towards the desired positions,
shown as the red dots in Figure 2.9(b).
Figure 2.9: Demonstration of PBVS: (a) Initial coordinates of features, marked as yellow dots, and
(b) Desired coordinates of features, marked as red dots.
Figure 2.10: Controlling position coordinates to move to desired location with PBVS.
PBVS then moves the camera, and/or object, such that the position coordinates of the features, s,
align with those in the desired state, s*, as demonstrated in Figure 2.10. Unlike IBVS, PBVS can
work without the features necessarily being in the field of view, given trajectory planning, as
performed by Thuilot et al. [51].

In order to move the micropipette to a targeted position, i.e. the centroid of the blastomere of
interest, C_B, visual servoing is required. In this case, s* = C_B, as C_B is the desired position of
the micropipette. Since the microscope provides only a single 2D image at a time, due to its
limited depth of focus, while the centroid must be determined in 3D, PBVS is far better suited to
this research and is the approach adopted. The scarcity of PBVS as a vision-based control
approach at the cellular scale also motivates an investigation into whether PBVS is a viable
method for performing IVF tasks on embryos.
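As a minimal sketch of the PBVS idea, the following assumes a simple proportional control law v = −λe on the 3D position error e = s − s*; the gain, step size, tolerance, and function names are illustrative assumptions, not the controller developed in this thesis:

```python
import numpy as np

def pbvs_step(s, s_star, gain=0.5):
    """One PBVS iteration: compute the position error e = s - s*
    and return a proportional velocity command v = -gain * e."""
    e = np.asarray(s, dtype=float) - np.asarray(s_star, dtype=float)
    return -gain * e

def servo_to_target(s0, s_star, gain=0.5, tol=1e-3, dt=1.0, max_iters=100):
    """Iterate pbvs_step until the position converges to s* (||e|| < tol)."""
    s = np.asarray(s0, dtype=float)
    for _ in range(max_iters):
        v = pbvs_step(s, s_star, gain)
        if np.linalg.norm(v) < tol * gain:  # equivalent to ||e|| < tol
            break
        s = s + v * dt  # integrate the velocity command
    return s
```

In practice, each iteration would re-measure the micropipette position from the image before computing the error, rather than integrating an assumed model.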
Methodology
In this chapter, the methodology and design procedure for the software algorithms used to obtain
the centroid of the blastomere of interest are presented. Section 3.1 introduces the concept of
z-stack images and how they are obtained with the lab equipment (Section 3.1.1). Section 3.2
details the image processing algorithms required for blastomere segmentation, including
initializing the region of interest (Section 3.2.1), creating a low-cost energy path (Section 3.2.2),
and calculating the centroid of the blastomere (Section 3.2.3) [52]. The last section, Section 3.3,
introduces the methodology used to calibrate the micropipette (Section 3.3.1) and to visually
servo the pipette to the position of the target blastomere of interest (Section 3.3.2).
3.1 z-Stack Images
The methodology adopted utilizes the concept of z-stack images to calculate the centroid, C_B =
(x_B, y_B, z_B), in 3D Cartesian coordinates, where C_B is the true centroid position of the
blastomere of interest (BOI), and x_B, y_B, and z_B are the true centroid coordinate components
along the x, y, and z axes respectively, as shown in Figure 3.1. Z-stack images are a sequence of
images captured successively at equally spaced focal planes, as shown in Figure 3.2.
Figure 3.1: Diagram showing the axes of motion and the Cartesian reference frame. The motorized
stage moves in the xy-plane. The objective lens moves along the z-axis.
Z-stack images for the experimental work in this research are acquired with an inverted brightfield
microscope. The z-stack image capturing setup is shown in Figure 3.1. An embryo is placed upon
a motorized stage while submerged in embryo culture media. The objective lens of the microscope
is situated below the embryo. The motorized stage moves in the xy-plane, whereas the objective
lens moves along the z-axis. Using this microscope system, Hoffman modulation contrast (HMC)
images, denoted I_1, I_2, ..., I_n, are captured successively at equally separated focal planes with
the objective lens. The individual images are called z-stack images, whereas the collection of
these z-stack images is called the image stack. Figure 3.2 illustrates the z-stack images and how
they relate to the image stack, as well as the spacing distance between each z-stack image, δ_I,
and the notation for the i-th z-stack image of interest (IOI), I_i, which is outlined in red and used
later in Section 3.2.2. Figure 3.2 also includes the two z-stack images I_{i-1} and I_{i+1},
outlined in blue, which are involved in the process of creating an image array, J, also used in
Section 3.2.2.
Figure 3.2: Diagram of the image stack and z-stack images. The z-stack image outlined in red is
the z-stack image of interest (IOI). The two z-stack images outlined in blue are involved in the
process of creating the image array, J, which is further explained in Section 3.2.2.
The spacing between each focal plane, δ_I, is found using Berek's formula, shown as (2.1) [35].
Since every z-stack image, I, has a limited depth of field (DoF) range, the spacing must be chosen
so that no blastomere information is lost: overlapping the DoF of adjacent z-stack images is
acceptable, but underlapping loses necessary information. Using Berek's formula to evaluate the
DoF for the 20x objective lens used to acquire the z-stack images, δ_I is found to be approximately
5 μm, and thus each z-stack image is separated 5 μm apart.
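As a numerical sketch, Berek's formula (2.1) can be evaluated directly. The parameter values below (λ = 0.55 μm, NA = 0.45, total magnification 200, n = 1) are plausible assumptions for a 20x objective with a camera relay, not the thesis's recorded settings; with these values the formula comes out near the 5 μm spacing used:

```python
def berek_dof(wavelength_um, na, total_mag, n=1.0):
    """Berek's formula (2.1): DoF = n * (lambda / (2 * NA^2) + 340 / (M_t * NA)),
    with the wavelength in micrometres and the result in micrometres."""
    return n * (wavelength_um / (2.0 * na ** 2) + 340.0 / (total_mag * na))

# Illustrative 20x objective with a 10x camera relay (assumed values):
dof = berek_dof(wavelength_um=0.55, na=0.45, total_mag=200)  # ~5.1 um
```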
3.1.1 Obtaining z-Stack Images
HMC z-stack images of the embryo are taken with a Nikon Ti-U inverted microscope. The first
image is taken with the focal plane located slightly below the bottom of the embryo. Using the
microscope software to drive the Prior Proscan III x-y-z microscope stage, the objective lens
moves upwards along the z-axis, and with the Micromanager and ImageJ software, the camera
successively captures an image every 5 μm, until it reaches 60 z-stack images, or 120 μm, which
is slightly larger than the diameter of a mouse embryo, typically 100 μm. This results in the last
image being acquired slightly above the embryo, and ensures that data for the entire embryo is
collected. This z-stack image acquisition is shown in Figure 3.3(a). The images are collated into
a TIF file, a multi-page file format that contains all the images taken. Figure 3.3(b) shows a sample
of the format of the TIF file. Once the TIF file of the image stack is acquired, it is imported into
Matlab, which then initiates the blastomere image segmentation process detailed in Section 3.2.
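The acquisition loop itself is simple to express in code. In the sketch below, the camera and stage objects and their method names are hypothetical stand-ins for the Micromanager/stage interface, not an actual API:

```python
def acquire_z_stack(camera, stage, n_slices=60, step_um=5.0):
    """Capture n_slices images at equally spaced focal planes along z.

    `camera.snap()` and `stage.move_z_relative()` are hypothetical
    stand-ins for the real acquisition interface.
    Returns the list of captured slices (the image stack)."""
    stack = []
    for _ in range(n_slices):
        stack.append(camera.snap())     # capture the current focal plane
        stage.move_z_relative(step_um)  # step the focal plane up by 5 um
    return stack
```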
Figure 3.3: Z-stack images of an embryo. (a) Embryo with z-stack images taken successively at
equally separated focal planes. (b) Individual z-stack images stacked to indicate which part of the
blastomere is captured in which z-stack image; this also illustrates the format of the TIF file.
3.2 Blastomere Segmentation
A series of image processing steps are executed to calculate the centroid, C_B = (x_B, y_B, z_B),
of the BOI. The proposed blastomere segmentation algorithm comprises three main steps. First,
in the initialization step (Step 1), an approximation of the centroid of the BOI, C_a = (c_ax, c_ay),
is computed using a series of image processing algorithms. This is a prerequisite for Step 2, which
requires an approximate centroid to create a region of interest (ROI). Step 2 utilizes this ROI,
converting the image to polar coordinates. Following a series of image processing steps, a
low-cost directed graph is generated to find the contour of the BOI. With the BOI segmented in
the image I_i, the centroid of this 2D image is computed; the z-coordinate of the BOI is obtained
from the z-coordinate at which the z-stack image was acquired. These steps are then completed
for all z-stack images of the image stack. Lastly, the 3D centroid of the BOI is calculated from
the centroids of the BOI in each 2D image, as C̄ = (x̄, ȳ, z̄). The following sections, Section 3.2.1,
Section 3.2.2, and Section 3.2.3, detail the Initialization, Low-Cost Energy Path, and Blastomere
Centroid Calculation image processing steps respectively.
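The final aggregation over the stack can be sketched as follows, assuming the per-slice 2D centroids and slice z-coordinates are available; a plain mean over the slices is used here for illustration, and the thesis may weight slices differently:

```python
def stack_centroid(slice_centroids, z_coords):
    """Estimate the 3D BOI centroid from per-slice 2D centroids.

    slice_centroids: list of (x, y) centroids, one per z-stack image
                     in which the BOI was segmented.
    z_coords: the z-coordinate of each of those slices.
    Returns (x_bar, y_bar, z_bar) as a plain mean over the slices."""
    n = len(slice_centroids)
    x_bar = sum(x for x, _ in slice_centroids) / n
    y_bar = sum(y for _, y in slice_centroids) / n
    z_bar = sum(z_coords) / n
    return (x_bar, y_bar, z_bar)
```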
3.2.1 Initialization (Step 1)
The first of the image processing steps is to initialize a region of interest (ROI) required for the
subsequent steps. The low-cost energy path, detailed in Section 3.2.2, is an image processing
procedure that segments the boundary of the BOI in a z-stack image. To simplify the procedure,
only the area that contains this boundary, the ROI, is processed, rather than the entire image. Due
to the approximately circular shape of the blastomeres, the ROI bounds are also circular, and
should be centered such that the ROI encompasses the BOI boundary on both the inner and outer
sides. This centered position is approximated as C_a = (c_ax, c_ay) using image processing
procedures. The approximation does not need to be exact: as long as the bounds of the ROI fully
encompass the BOI boundary, the low-cost energy path algorithm can perform the blastomere
segmentation.

To begin locating the centroid approximation, C_a = (c_ax, c_ay), initial image processing steps
are performed. To illustrate this process, z-stack images of a 2-cell stage embryo are used,
selecting a z-stack image near the middle of the stack, where the blastomere is largest and clearest,
labelled I_i. First, a greyscale image of the embryo is used as input to the image processing steps
described in the following, as shown in Figure 3.4.
Figure 3.4: The original image of the embryo, selected from the middle of the image stack [15].
Next, a standard deviation filter is applied to the image, accentuating prominent lines in the image,
such as the blastomere boundaries. Other methods, such as the Canny and Sobel operators, were
experimented with; however, due to the complex nature of the inner regions of the blastomeres
and their unclear contrast, they produce chaotic line segments within the blastomere, even after
varying their respective thresholds, which are difficult to remove with further image processing.
Hough transforms were also attempted, but this approach was found to work better for finding
objects with either straight, distinct lines or a high degree of circularity. For these reasons, the
standard deviation filter is adopted to accentuate the necessary lines, such as the blastomere
boundaries. The Matlab function stdfilt applies a local 3x3 standard deviation filter, which
accentuates the lines needed for further processing, as shown in Figure 3.5.
Figure 3.5: Image with standard deviation filter applied.
The standard deviation filter outputs an array with intensities ranging from 0 to 255. To create a
segmented image, a binary mask is required. The image is therefore passed through a binary
threshold filter, which converts each pixel intensity to either 0 or 1, depending on whether it is
greater than a given threshold value, F_thresh. In this case, F_thresh is chosen empirically such
that the blastomeres are not fragmented and much of the image remains intact. Note that the value
of F_thresh may vary with parameters that change how the image stack is acquired, such as the
exposure set by the image capturing system. The Matlab function imbinarize then binarizes the
standard deviation filtered image with the threshold F_thresh, as shown in Figure 3.6.
Figure 3.6: The thresholded binarized image.
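These two steps (stdfilt followed by imbinarize) have straightforward analogues outside Matlab. The sketch below reimplements a local 3x3 standard deviation filter and a fixed binary threshold in Python with NumPy; any threshold value passed at call time is a placeholder, since F_thresh is chosen empirically in this work:

```python
import numpy as np

def stdfilt3(image):
    """Local 3x3 standard deviation filter (edges use replicated borders),
    analogous to Matlab's stdfilt with its default 3x3 neighbourhood
    (which normalizes by N-1, hence ddof=1)."""
    img = np.asarray(image, dtype=float)
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine 3x3-neighbourhood shifts and take the std over them.
    shifts = [padded[r:r + img.shape[0], c:c + img.shape[1]]
              for r in range(3) for c in range(3)]
    return np.std(np.stack(shifts), axis=0, ddof=1)

def binarize(image, f_thresh):
    """Binary threshold: 1 where the intensity exceeds f_thresh, else 0."""
    return (np.asarray(image) > f_thresh).astype(np.uint8)
```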
To eliminate unwanted noise from the thresholded image, and to focus only on the blastomeres,
an area filter is then applied. Matlab's area filter function, bwareafilt, is used to keep the largest
connected area, the embryo, and to remove image noise, as shown in Figure 3.7.
Figure 3.7: Image with area filter applied.
The resulting area is then hole-filled with Matlab's imfill function. A single blob containing the
two blastomeres remains, as shown in Figure 3.8.
Figure 3.8: Image with area fill applied.
To further smooth the image and eliminate extraneous artifacts, a structuring element is created
and applied. Using the Matlab strel function, a disk-shaped structuring element of size 3 is applied
over the image, smoothing the blob, as shown in Figure 3.9. Erosion and dilation methods may
also be used; however, these do not guarantee the removal of all unwanted anomalies.
Figure 3.9: Image smoothed by a structuring element, resulting in a blob containing the two
blastomeres.
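The area-filter step, keeping only the largest connected component, can also be sketched without Matlab. The following is a plain flood-fill labelling with 4-connectivity, a simplified stand-in for bwareafilt(mask, 1):

```python
import numpy as np

def keep_largest_component(mask):
    """Keep only the largest 4-connected component of a binary mask,
    a simplified analogue of Matlab's bwareafilt(mask, 1)."""
    mask = np.asarray(mask, dtype=bool)
    labels = np.zeros(mask.shape, dtype=int)
    sizes = {}
    current = 0
    for r0 in range(mask.shape[0]):
        for c0 in range(mask.shape[1]):
            if mask[r0, c0] and labels[r0, c0] == 0:
                current += 1
                stack = [(r0, c0)]
                labels[r0, c0] = current
                size = 0
                while stack:  # iterative flood fill
                    r, c = stack.pop()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = current
                            stack.append((rr, cc))
                sizes[current] = size
    if not sizes:
        return np.zeros_like(mask, dtype=np.uint8)
    biggest = max(sizes, key=sizes.get)
    return (labels == biggest).astype(np.uint8)
```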
The structuring element thus leaves only the two blastomeres of the embryo. However, to separate
the two blastomeres, further image processing is required. Note that 2-cell stage embryos
suspended in a stable orientation have their blastomeres lying on a plane parallel to the camera
focal plane [36], [37]. 4-cell stage embryos have blastomeres oriented in a tetrahedral pattern,
with two blastomeres on a top plane and two on the plane below, as shown in Figure 2.3 [38].
Occasionally, the embryo is oriented such that three blastomeres lie on the bottom plane and one
lies on top, forming a pyramid shape; however, this scenario is unlikely [36], [37].

Knowing this orientation of the blastomeres within the embryo, image processing algorithms can
be applied to split the blob into the two blastomeres, and then to calculate the resulting
approximate centroid, C_a = (c_ax, c_ay).
There exist several methods for acquiring approximate centroids for the ROI. One method, used
by Giusti et al., involves distance transforms [53]. A distance transform calculates the distance
from every pixel with intensity 1 (white) to the closest pixel with intensity 0 (black), and defines
a new image whose pixel intensities are based on that distance. This leads to images with local
maxima at the positions furthest from the boundary. Giusti et al. used the distance transform on
images of human zygotes to approximate their centroids [53]. However, this was performed for
single-celled zygotes. The method was investigated experimentally using 2-cell stage embryos,
and was found to produce inaccuracies due to the presence of several blastomeres in the image.
It may be well suited to approximating singular circular shapes, such as zygotes, but not multiple
blastomeres. Note that there are several approaches to obtaining the centroid approximation, and
the following method is only one of them; if a viable centroid approximation is obtained by any
method, the subsequent steps, detailed in Section 3.2.2 and Section 3.2.3, may proceed. The
proposed method is to first separate the two blastomeres in the image, and then calculate the
centroids of the two blastomeres.
The first step is to separate the blob into the two blastomeres. This separation must be performed
along the line at which the two blastomeres come into contact. The idea is to create a line that
divides the two blastomeres along the shortest width of the blob. To do so, first the centroid of
the blob, C_blob, from Figure 3.9, is calculated using Matlab's regionprops function, as shown in
Figure 3.10(b). Next, the shortest distance from C_blob to the edge is calculated, and a straight
line is drawn to the closest edge, shown as the pink line in Figure 3.10(c). A 1-pixel-wide,
0-intensity line is then drawn from that edge, through C_blob, until it hits another edge. This
marks the shortest width of
the blob, and should be the ideal place to separate the blastomeres in two. A multiplication
operation between the line and the image then removes the part of the blob at the location of the
line, shown in Figure 3.10(d). The resulting image shows two separate blobs, which are the
separated blastomeres. As before, the regionprops function is applied to obtain the centroids of
the two blastomeres, C_b = (C_bx, C_by), labelled C_b1 = (C_bx1, C_by1) and C_b2 = (C_bx2, C_by2)
respectively. The example followed in this methodology will observe the second blastomere,
labelled with centroid approximation C_b2, as the BOI.
Figure 3.10: Image processing algorithms to acquire approximate centroids. (a) Blob acquired
from previous step Figure 3.9. (b) Calculated centroid of blob represented as a blue *. (c) Line
from blob centroid to closest edge. (d) Segmented blastomeres from line cut. (e) Centroids of
respective blastomeres, with the centroids represented as a blue *.
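The splitting step above (centroid of the blob, shortest line to the background, 1-pixel cut through the centroid) can be sketched in Python on a toy binary image. This is a simplified illustrative analogue of the Matlab regionprops-based procedure, not the thesis code; the function names and the toy image are hypothetical.

```python
from math import hypot

def split_blob(img):
    """Cut a binary blob along a 1-pixel line through its centroid,
    directed toward the nearest background pixel (the shortest width)."""
    rows, cols = len(img), len(img[0])
    ones = [(r, c) for r in range(rows) for c in range(cols) if img[r][c]]
    # Blob centroid, rounded to the nearest pixel.
    cr = round(sum(r for r, _ in ones) / len(ones))
    cc = round(sum(c for _, c in ones) / len(ones))
    # Nearest background pixel gives the direction of the shortest width.
    zeros = [(r, c) for r in range(rows) for c in range(cols) if not img[r][c]]
    nr, nc = min(zeros, key=lambda p: hypot(p[0] - cr, p[1] - cc))
    dr = 0 if nr == cr else (1 if nr > cr else -1)
    dc = 0 if nc == cc else (1 if nc > cc else -1)
    # Trace the cutting line through the centroid in both directions.
    line = [(cr, cc)]
    for sign in (1, -1):
        r, c = cr + sign * dr, cc + sign * dc
        while 0 <= r < rows and 0 <= c < cols and img[r][c]:
            line.append((r, c))
            r += sign * dr
            c += sign * dc
    for r, c in line:  # multiplication by a 0-intensity line = erase it
        img[r][c] = 0
    return img

def count_components(img):
    """4-connected component count via flood fill."""
    seen, comps = set(), 0
    for r in range(len(img)):
        for c in range(len(img[0])):
            if img[r][c] and (r, c) not in seen:
                comps += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    if not (0 <= y < len(img) and 0 <= x < len(img[0])):
                        continue
                    if not img[y][x]:
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return comps
```

On a wide rectangular blob (two touching square "blastomeres"), the nearest background lies above or below the centroid, so the cut falls across the blob's shortest width and yields two components.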
3.2.2 Low-Cost Energy Path (Step 2)
The second of the series of image processing steps is to segment the blastomere z-stack IOI, I_i.
To do so, given the centroid approximation, C_b = (C_bx, C_by), from Section 3.2.1, an ROI is
created that encompasses the boundary of the blastomere at I_i. To create the ROI, a circular
corona (annulus), centered at the centroid approximation, C_b, is formed. Its size must be chosen
such that the corona fully encompasses the blastomere boundary on both the inside and outside. The
inner ring of the corona, r', must be smaller than the boundary, and the outer ring, r'', must be
larger. The sizes of these rings must account for the blastomere's shape and the accuracy of the
centroid approximation. For the purposes of this example, the radii of these rings are set to
r' = 50 pixels and r'' = 100 pixels respectively, as can be seen in Figure 3.11.
Figure 3.11: z-Stack image with ROI around BOI.
Instead of using only the z-stack IOI, I_i, three z-stack images from the image stack are used for
the low-cost energy path algorithm: the z-stack IOI (I_i), one z-stack image above (I_i+1), and
one z-stack image below (I_i-1) the z-stack IOI. This accounts for z-direction gradient changes in
the dataset while determining the low-cost energy path. More z-stack images from the image stack
may be used for the algorithm, but at the cost of a linear increase in computation time, as
explained further below. The ROI is then converted from Cartesian coordinates to polar coordinates
through bilinear interpolation using (3.1), as shown in Figure 3.12.

J(θ, r, m) = I(C_bx + r·cos(θ), C_by + r·sin(θ), m)    (3.1)

0 ≤ θ ≤ 2π,  r' ≤ r ≤ r''
Figure 3.12: ROI displayed in polar coordinates at the z-stack IOI, J_i.
J is the n_θ x n_r x 3 image array in cylindrical polar coordinates. θ and r are uniformly
sampled in n_θ and n_r intervals, respectively. The smaller the values chosen for n_θ and n_r,
the less computationally expensive the image processing procedure; conversely, larger values make
the procedure more computationally expensive. The data format of the J image array is shown in
Figure 3.13. J_i is the image array at index i, where i is the index of the image array, analogous
to how i represents the index of the IOI. Since three images are converted into polar coordinates,
the images I_i-1, I_i, and I_i+1 are converted, with bilinear interpolation, into J_i-1, J_i, and
J_i+1 respectively.
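The Cartesian-to-polar conversion of (3.1) can be sketched in Python. The thesis performs this step in Matlab; the sketch below is an illustrative analogue, with hypothetical function names, that samples an annular ROI on a uniform (θ, r) grid using bilinear interpolation.

```python
import math

def bilinear(img, x, y):
    """Bilinearly interpolate image intensity at non-integer (x, y),
    where x indexes columns and y indexes rows (clamped to the image)."""
    x0 = min(max(int(math.floor(x)), 0), len(img[0]) - 2)
    y0 = min(max(int(math.floor(y)), 0), len(img) - 2)
    dx, dy = x - x0, y - y0
    top = (1 - dx) * img[y0][x0] + dx * img[y0][x0 + 1]
    bot = (1 - dx) * img[y0 + 1][x0] + dx * img[y0 + 1][x0 + 1]
    return (1 - dy) * top + dy * bot

def to_polar(img, cx, cy, r_in, r_out, n_theta, n_r):
    """Unwrap an annular ROI centred at (cx, cy), with inner radius r_in
    and outer radius r_out, into an n_theta x n_r polar image, as in (3.1)."""
    J = []
    for it in range(n_theta):
        theta = 2.0 * math.pi * it / n_theta          # 0 <= theta < 2*pi
        row = []
        for ir in range(n_r):                          # r' <= r <= r''
            r = r_in + (r_out - r_in) * ir / (n_r - 1)
            row.append(bilinear(img,
                                cx + r * math.cos(theta),
                                cy + r * math.sin(theta)))
        J.append(row)
    return J
```

Smaller n_theta and n_r reduce the work per plane, matching the computation-time trade-off noted in the text.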
Figure 3.13: Format of image array, J.
In order to calculate the graph-based, low-cost energy path, image array J is processed to acquire
energy values for its respective pixels. The energy value at every pixel is defined by Giusti et
al. as (3.2) and (3.3) [53].

E(θ, r) = σ(cos(θ − α)·G_r(J) + sin²(θ − α)·|G_r(J)|)    (3.2)

σ(x) = (1 + e^(x/K))^(−1)    (3.3)

The variable G_r represents the gradient operator along the radial direction, r. The sigmoid
function, σ(x), is applied to bring all values of the energy array E(θ, r) into the range [0, 1].
The scaling factor, K, is set to 1/5 of the image array dynamic range. α is the direction of the
lighting resulting from the use of HMC imaging. Processing image array J with (3.2) results in the
energy array E(θ, r). The 3D adaptation of (3.2) incorporates the z-stack images surrounding the
IOI through the J array with parameters θ, r, and m, and is shown as (3.4). The left component
dominates where the contour is orthogonal to the
light direction, and the right component accounts for the unpredictability of the contour
appearance where the contour is parallel to the light direction [53].

E(θ, r, m) = σ(cos(θ − α)·G_r(J) + sin²(θ − α)·|G_r(J)|)    (3.4)
An example of the energy array at the z-stack IOI, E_i, is shown in Figure 3.14.
Figure 3.14: Energy array at the z-stack image of interest, E_i.
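The energy computation of (3.2) and (3.3) can be sketched in Python for a single polar plane. This is an illustrative sketch, not the thesis implementation: the radial gradient G_r is approximated here with a finite difference along r, and the function names are hypothetical.

```python
import math

def sigmoid(x, K):
    """Sigmoid of (3.3): maps gradient responses into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(x / K))

def energy(J, alpha, K, thetas):
    """Energy array of (3.2) for one polar plane J[theta][r].
    thetas[t] is the angle of row t; alpha is the HMC lighting direction."""
    n_theta, n_r = len(J), len(J[0])
    E = [[0.0] * n_r for _ in range(n_theta)]
    for t in range(n_theta):
        for r in range(n_r):
            # Finite-difference approximation of the radial gradient G_r.
            lo, hi = max(r - 1, 0), min(r + 1, n_r - 1)
            g = (J[t][hi] - J[t][lo]) / max(hi - lo, 1)
            d = thetas[t] - alpha
            E[t][r] = sigmoid(math.cos(d) * g + math.sin(d) ** 2 * abs(g), K)
    return E
```

With a flat intensity profile the gradient vanishes and every energy value is exactly σ(0) = 0.5, the midpoint of the [0, 1] range.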
After the energy array is created, a directed graph structure is constructed with each pixel
represented as a node, in order to generate a low-cost energy path. A directed graph is a set of
vertices, or nodes, connected to each other, where each connection has a direction [54]. A
directed edge, also known as an arc, is an ordered pair, labelled (a, b) for example, where a is
the node it starts at and b is the node it is directed towards. A value is associated with each
arc, which in this case is the pixel value of the energy array, E. Nodes may have as many arcs
directed towards and away from them as needed. The locations of the nodes do not matter; what
matters is which nodes connect to one another. Figure 3.15 shows a basic graph, where node a has
three directed arcs to nodes b, c, and d, with respective energy values E_b, E_c, and E_d.
(a, b): E_b
(a, c): E_c
(a, d): E_d
Figure 3.15: Basic graph structure example.
Paths are routes taken to travel from one node to another, even if they are not connected
directly. The choice of path depends entirely on the problem being solved. For example, the
Travelling Salesman Problem is a problem in which all nodes are connected to each other, forming
what is known as a Hamiltonian circuit, with different weights on the connections. The goal of the
Travelling Salesman Problem is to start at one node and travel to all other nodes over the
shortest total distance possible. This problem is used to model path planning algorithms, such as
those used by global positioning systems (GPS). There are many algorithms that can solve the
Travelling Salesman Problem, but the main issue is the complexity of the graph. A brute force
algorithm will always find the shortest path, but takes the longest time, since it must analyze
the energy values between all nodes to determine the shortest path. A more directed algorithm,
such as the nearest-neighbour algorithm, does not need to analyze the energy values of all the
nodes, but only the arcs with the smallest energy values. Depending on the scenario, multiple
algorithms may be used to solve the same problem.
Since the graph in this methodology is not a Hamiltonian-circuit-like graph in which all nodes are
connected to each other, other algorithms must be employed. The brute force approach could be
used, but would require far too much computation, especially as graph complexity increases. The
algorithm implemented in this methodology is the low-cost path algorithm. The low-cost path
algorithm starts from the node with the lowest energy value in the rightmost column and observes
all the nodes connecting to it. Of those connecting nodes, it selects the node that has the
smallest energy value. The journey continues until it reaches the leftmost node, and the path
taken is stored. An example of this graph and algorithm is illustrated in Figure 3.16. The nodes a
through f are connected as shown in the figure and the arc list beside it, where each arc is
labelled with the energy value of its destination node. The node with the lowest energy value in
the righthand column is e, which has an energy value of 0.1. Of the nodes connected to it, b and
c, the node with the smallest energy value is c, with a value of 0.3. Even though d has a smaller
value of 0.2 within the same column, it is not connected to node e. Lastly, the only node
connected to c is a, which has an energy value of 0.4. The low-cost path generated is then as
follows: e → c, and then c → a, as outlined in orange in Figure 3.16.
(a, b): 0.5
(a, c): 0.3
(a, d): 0.2
(b, e): 0.1
(b, f): 0.2
(c, e): 0.1
(c, f): 0.2
(d, f): 0.2
Figure 3.16: Basic graph structure path example.
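The greedy right-to-left walk described above can be sketched in a few lines of Python. The node labels, energies, and predecessor lists below mirror the worked example; they are illustrative values, not data from the thesis experiments.

```python
def greedy_low_cost_path(columns, energy, preds):
    """Greedy low-cost path: start at the lowest-energy node in the
    rightmost column, then repeatedly step to the lowest-energy
    predecessor until the leftmost column (no predecessors) is reached."""
    node = min(columns[-1], key=energy.get)  # lowest energy, right column
    path = [node]
    while preds.get(node):
        node = min(preds[node], key=energy.get)
        path.append(node)
    return path

# Node energies and reverse (right-to-left) connectivity of the example.
energies = {'a': 0.4, 'b': 0.5, 'c': 0.3, 'd': 0.2, 'e': 0.1, 'f': 0.2}
predecessors = {'e': ['b', 'c'], 'f': ['b', 'c', 'd'],
                'b': ['a'], 'c': ['a'], 'd': ['a']}
cols = [['a'], ['b', 'c', 'd'], ['e', 'f']]
```

Starting at e (0.1), the walk prefers c (0.3) over b (0.5), then reaches a, reproducing the path e → c → a. Note that, unlike the shortest-path search used later, this greedy walk does not guarantee a globally minimal path.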
For the methodology, a directed graph structure is constructed from the energy array E, with each
pixel represented as a node, to generate a low-cost energy path. In this graph, each node connects
to its forward-facing 26-neighbour pixels, up to a maximum of nine. The arcs are forward-facing
due to the nature of the path: because of the round shape of the blastomeres, segmentation occurs
sequentially, left to right, in the polar coordinate frame of reference. This way, the path always
travels in one direction and does not stagnate during its search. Furthermore, the reason why
only the closest neighbouring nodes are connected to each other is twofold. First, connecting more
nodes adds linearly to the complexity of the graph. Second, the blastomere does not vary enough in
shape to justify connecting nodes to more than their closest neighbouring nodes. Figure 3.17 shows
how the nodes connect to one another on one z-stack energy image, E_i.
Figure 3.17: Sample of 2D graph structure of the energy z-stack image, E_i.
For this sample, the node represented by the bright red square at (θ_j, r_k) of E_i is taken as an
example. On this plane, the node connects to its forward-facing 8-neighbouring nodes, represented
by the green squares. Note that for the 2D example, this means that the bright red node only
connects to three other nodes. This is true for all nodes within the inner rows of the columns,
i.e. nodes within r_2 and r_(n_r−1). The nodes in the outer rows, i.e. r_1 and r_(n_r), each have
two forward-facing nodes. To expand on the graph structure, a 3D graph structure is made with the
introduction of the surrounding z-stack images, E_i−1 and E_i+1, as shown in Figure 3.18.
Figure 3.18: Sample of 3D graph structure, E(θ_j, r_k, m).
For this sample, the node represented by the bright red square at (θ_j, r_k) of E_i is taken as an
example. In this graph structure, the node connects to its forward-facing 26-neighbouring nodes,
represented by the green squares. For the 3D example, this means that the bright red node connects
not only to the three nodes in front of it on E_i (column θ_j+1, rows r_k−1 to r_k+1), but also to
the three corresponding nodes on the plane above (E_i+1) and the three on the plane below (E_i−1),
for a total of nine connections. This is true for all nodes within the inner rows of the columns
on E_i, i.e. nodes within r_2 and r_(n_r−1). The nodes in the outer rows on E_i, i.e. r_1 and
r_(n_r), each have six forward-facing nodes. The nodes within the inner rows of the columns of
E_i−1 and E_i+1, i.e. nodes within r_2 and r_(n_r−1), also each have six forward-facing nodes.
And the nodes in the outer rows of the columns on E_i−1 and E_i+1, i.e. r_1 and r_(n_r), each have
four forward-facing nodes. Figure 3.19 illustrates these four components, highlighted in multiple
colours.
Figure 3.19: Components of 3D Graph Structure Complexity.
The illustration depicts the general case in which m z-stack images are used, rather than three.
Three is selected because of the complex nature of the 3D graph structure. The equation for the
number of arcs (permutations) possible, P_n, for a graph structure of size n_θ x n_r x m is shown
below as (3.5), as well as in Figure 3.19.

P_n = (n_r − 2)(9)(n_θ − 1)(m − 2) + (2)(6)(n_θ − 1)(m − 2)
    + (n_r − 2)(2)(6)(n_θ − 1) + (2)(2)(4)(n_θ − 1)    (3.5)
The complexity increases linearly with the number of z-stack images, m, within the energy array.
Using equation (3.5), P_n is found to be 206,164 for m = 3. If m = 1, P_n is found to be 29,452.
A graph showing P_n for various values of m is shown in Figure 3.20. If more than three z-stack
images are used, they must be chosen symmetrically about E_i as E_i±c, where c ∈ ℕ+, which is
also why m is always an odd number.
Figure 3.20: Graph showing the number of permutations, P_n, vs. the number of z-stack images, m,
within the graph for the energy array, E.
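Equation (3.5) can be checked directly. The sketch below evaluates P_n term by term; it reproduces the two counts quoted in the text (206,164 for m = 3 and 29,452 for m = 1, with n_θ = 200, n_r = 50, matching the 200 x 50 x 3 energy array discussed below).

```python
def num_arcs(n_r, n_theta, m):
    """Number of directed arcs P_n of (3.5) for an n_theta x n_r x m
    energy array. The four terms are: inner rows of interior planes (9
    arcs each), outer rows of interior planes (6 arcs each), inner rows
    of the two outermost planes (6 arcs each), and outer rows of the two
    outermost planes (4 arcs each)."""
    return ((n_r - 2) * 9 * (n_theta - 1) * (m - 2)
            + 2 * 6 * (n_theta - 1) * (m - 2)
            + (n_r - 2) * 2 * 6 * (n_theta - 1)
            + 2 * 2 * 4 * (n_theta - 1))
```

Because m appears only linearly, each additional pair of z-stack images adds a fixed number of arcs, which is the linear growth plotted in Figure 3.20.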
The most computationally expensive part of the overall methodology is creating the 3D graph
structure of the energy array. In Matlab, the graph structure is created using a sparse matrix: a
square adjacency matrix, with one row and one column per node, that keeps track of all nodes and
each of their connections. For example, an energy array of size 200 x 50 x 3 can require a sparse
matrix of size 29253 x 29253, with over 855 x 10^6 array indices. A sparse matrix of this
29253 x 29253 size is shown in Figure 3.21. Due to this large array, the majority of the
computation time is spent creating this sparse matrix for every z-stack image I_i of the image
stack.
Page 56
39
Figure 3.21: Sparse matrix for m = 3.
After generating the sparse matrix, a path is computed using the graphshortestpath function in
Matlab; this path is the 3D boundary of the BOI at I_i, denoted w_i. A projection of the 3D path
onto the xy-plane is illustrated in Figure 3.22. The path is then converted back from polar
coordinates into Cartesian coordinates, to complete the desired boundary for the z-stack image,
using a bilinear interpolation equation analogous to (3.1). The centroid of the resulting shape in
the ith z-stack image is then calculated in Matlab as C_i = (C_ix, C_iy), as shown in Figure 3.23.
This step is then computed for all n z-stack images of the image stack. Since the blastomere is
stationary during this automated step, the same centroid approximation, C_b, (from Section 3.2.1)
can be used to initialize this step. The radii r' and r'' change depending on which z-stack image
of the image stack is processed.
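For non-negative weights such as the energy values here, Matlab's graphshortestpath defaults to Dijkstra's algorithm. A minimal Python analogue on an adjacency-list graph is sketched below; the toy graph in the usage example is hypothetical and only illustrates the call.

```python
import heapq

def shortest_path(adj, src, dst):
    """Dijkstra's algorithm on an adjacency-list graph.
    adj maps each node to a list of (neighbour, weight) pairs;
    returns the node sequence from src to dst and its total cost."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float('inf')):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path by walking the predecessor links backwards.
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]
```

In the actual pipeline, the nodes would be the (θ, r, m) pixels of the energy array and the weights the energy values, with the forward-facing arcs restricting the search to left-to-right motion in the polar image.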
Figure 3.22: Low-cost energy path. (a) Path w_i at E_i−1. (b) Path w_i at E_i. (c) Path w_i at
E_i+1. (d) Path w_i projected onto the xy-plane, W_i.
Figure 3.23: Computed path, W_i, represented by the red line, and centroid, C_i, represented by a
red *, of the z-stack IOI, I_i.
3.2.3 Blastomere Centroid Calculation (Step 3)
With the centroids of all resulting z-stack images calculated, the three-dimensional centroid of
the entire blastomere, C̄ = (x̄, ȳ, z̄), can be calculated. This is completed with an averaging
process (a simple mean in x and y, and an area-weighted mean in z), with each dimension shown in
the following equations, (3.6), (3.7), and (3.8).

x̄ = (1/n) Σ_{i=1}^{n} C_ix    (3.6)

ȳ = (1/n) Σ_{i=1}^{n} C_iy    (3.7)

z̄ = Σ_{i=1}^{n} (z_i · A_i) / Σ_{i=1}^{n} A_i    (3.8)

Where n is the number of z-stack images, i is the index, z_i is the z-coordinate at z-stack image
I_i, and A_i is the area contained within the boundary of path W_i of the segmented blastomere at
I_i.
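Equations (3.6) through (3.8) amount to a few lines of arithmetic. The sketch below implements them directly; the sample slice centroids, z-coordinates, and areas in the test are illustrative numbers, not experimental data.

```python
def blastomere_centroid(cx, cy, z, areas):
    """Blastomere centroid of (3.6)-(3.8): simple means of the per-slice
    centroids in x and y, and an area-weighted mean along z.
    cx, cy: per-slice centroid coordinates C_ix, C_iy
    z:      per-slice z-coordinates z_i
    areas:  per-slice segmented areas A_i"""
    n = len(cx)
    x_bar = sum(cx) / n                                       # (3.6)
    y_bar = sum(cy) / n                                       # (3.7)
    z_bar = sum(zi * ai for zi, ai in zip(z, areas)) / sum(areas)  # (3.8)
    return x_bar, y_bar, z_bar
```

Weighting z̄ by the slice areas biases the vertical centroid toward the slices where the blastomere cross-section is largest, i.e. toward its equatorial plane.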
The resulting centroid, C̄ = (x̄, ȳ, z̄), is the calculated centroid of the blastomere. Figure
3.24 shows a diagram of the z-stack images with their centroids, C_i, and areas, A_i, as well as
the centroid of the blastomere, C̄. With this, successive blastomere-related tasks can be
performed, such as blastomere aspiration with a micropipette [55], and optimizing the laser zona
drilling ablation zone [56]. The entire image processing procedure of Section 3.2 is illustrated
as a flowchart in Figure 3.25.
Figure 3.24: Diagram of the z-stack image centroids, C_i, areas, A_i, and computed blastomere
centroid, C̄.
Figure 3.25: Flowchart of Blastomere Segmentation Algorithm. The orange section represents the
manual operations required to begin the automated task, whereas the blue sections represent the
automated tasks. Statements in green represent the output for its respective step.
3.3 Visual Servoing
Once the centroid of the blastomere of interest (BOI), C̄ = (x̄, ȳ, z̄), is computed, the next
step is to move a micropipette tip to this target position. This procedure is performed with
visual servoing; more specifically, with position-based visual servoing (PBVS), since 3D
coordinate data is utilized rather than image data. Different micropipette shapes are used for
different tasks. For example, for blastomere aspiration using a displacement method, a
micropipette with a small outer diameter is used to push fluid from within the embryo and aspirate
one or more blastomeres [14]. A holding pipette has an outer diameter large enough to immobilize
an embryo during a procedure. A partial zona dissection (PZD) micropipette is used for dissecting
the embryo, or perforating the zona pellucida [57].
In order to move the micropipette to the calculated blastomere centroid, C̄, visual servoing,
specifically PBVS, is performed. The visual servoing is discussed in the following two sections.
Section 3.3.1 describes the image processing procedure for micropipette calibration. Section 3.3.2
describes the micropipette control used to move the micropipette to the target position, C̄, using
PBVS.
The experimental setup is as follows, as shown in Figure 3.26. An embryo is submerged in a small
layer of culture media inside a petri dish. The petri dish is placed on the brightfield microscope
motorized stage, which has 2 degrees of freedom (DOF) along the xy Cartesian plane. The camera is
positioned below the microscope objective lens. Note that when the objective lens moves vertically
in the z-axis direction, the focal plane moves accordingly. Two micromanipulators are situated on
either side of the embryo. On one micromanipulator, with the aid of a vacuum pump, a holding
pipette is positioned such that it holds the embryo with negative pressure, to keep the embryo
from moving during an experimental procedure. The other micromanipulator holds a solid sharp-tip
PZD micropipette, angled towards the embryo. Each micromanipulator has 3 DOF, allowing motion
along the x, y, and z Cartesian axes. The stage, both micromanipulators, and the camera are
connected to a computer and are software controlled.
Figure 3.26: Schematic of the experimental setup.
3.3.1 Micropipette Calibration
Once the experimental equipment is set up, the first step is to calibrate the PZD micropipette
position in Cartesian space. The micropipette must be within the image workspace for proper
calibration (Figure 3.27(a)). To calibrate, the micropipette is first segmented with image
processing, similar to the work of Wong and Mills [55]. First, the Canny edge detection method is
applied with the Matlab edge function, as shown in Figure 3.27(b). Due to the simple, straight
shape of the micropipette, the Canny edge detection algorithm readily produces well-defined lines.
Since the micropipette extends outside the borders of the image, its outline cannot be closed, and
hence cannot be filled. To address this problem, the image borders are considered to be edges, and
the shape is then filled with the Matlab imfill function [55], as shown in Figure 3.27(c). Then,
since only the outline of the micropipette is required, the artificial edge on the border is
removed and the Matlab bwperim function is used to find this outline alone. From the outline, a
set of points is extracted, labelled S_outline(i), i ∈ [1, N], where N is the total number of
points, sequentially ordered from start to end, and i is the index [55], as shown in Figure
3.27(d). The angle of the micropipette, α, is used to find the position of the tip. A line running
through the median of the micropipette will encounter a point on the micropipette; the point at
which this encounter occurs is the tip. An initial estimate of the angle, α_init, is made by
averaging the slopes of the micropipette walls, α_init,L and α_init,R, as derived by (3.9 – 3.11).
h is the number of points used for estimating the angle, chosen such that it does
not include the tip, and varies depending on the micropipette type. The left and right sides of
the micropipette are named relative to which side of the tip they sit on.

α_init,L = tan⁻¹[(S_outline,y(1) − S_outline,y(1 + h)) / (S_outline,x(1) − S_outline,x(1 + h))]    (3.9)

α_init,R = tan⁻¹[(S_outline,y(N) − S_outline,y(N − h)) / (S_outline,x(N) − S_outline,x(N − h))]    (3.10)

α_init = (α_init,L + α_init,R) / 2    (3.11)
To find the angle of the micropipette, the side walls of the micropipette are used. Local angles
α_i at every point S_outline(i) are then averaged to find α, using (3.12), as derived by Wong and
Mills [55].

α_i = tan⁻¹[(S_outline,y(i + δ) − S_outline,y(i − δ)) / (S_outline,x(i + δ) − S_outline,x(i − δ))];
i ∈ [1 + δ, N − δ]    (3.12)
Where δ is the sampling length. The points that do not closely align with the micropipette walls,
such as those at the tip, are removed with a simple threshold criterion: only the points with
local angles within 15° of α_init are retained. The two remaining, now separated, sets of points
on the two sides are labelled S_L and S_R. The centroids of these sides are found with the Matlab
regionprops function and are labelled C_L and C_R, respectively. A line is then drawn through the
midpoint of these two centroids, with angle α [55]. The point at which this line encounters the
micropipette is the position of the tip of the micropipette, P, as shown in Figure 3.27(e).
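The initial angle estimate of (3.9) through (3.11) can be sketched in Python. This is an illustrative sketch of the slope-averaging step only, with hypothetical function names; the outline points in the test trace a hypothetical straight wall of slope 1.

```python
import math

def initial_angle(sx, sy, h):
    """Initial micropipette angle estimate of (3.9)-(3.11), averaging the
    slopes measured at the two ends of the ordered outline (sx, sy).
    h is the end offset; it must be small enough to exclude the tip."""
    def slope_angle(i, j):
        dx = sx[i] - sx[j]
        dy = sy[i] - sy[j]
        # tan^-1 of the slope; a vertical wall maps to pi/2.
        return math.pi / 2 if dx == 0 else math.atan(dy / dx)
    a_left = slope_angle(0, h)                            # (3.9)
    a_right = slope_angle(len(sx) - 1, len(sx) - 1 - h)   # (3.10)
    return (a_left + a_right) / 2                         # (3.11)
```

Using tan⁻¹ of the slope (rather than a direction-sensitive atan2) makes the two wall estimates agree regardless of the traversal direction of the ordered outline.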
Figure 3.27: Micropipette image segmentation. (a) Original image of micropipette. (b) Canny edge
detection. (c) Image Fill. (d) Micropipette outline split into side walls and tip. (e) Micropipette
with orientation and tip position.
Once the tip of the micropipette is found through these image processing techniques, the next step
is to calibrate the micropipette position in Cartesian space. To do so, the micropipette is first
placed at an arbitrary position within the camera workspace such that the micropipette is within
the focal plane of the camera. The tip position, P_1, is then found from the earlier image
processing procedure, as shown in Figure 3.28. With the micromanipulator, the micropipette moves
along the xy-plane to a known position within the camera workspace, and the same image processing
procedure is applied again, extracting the tip position, P_2, as shown in Figure 3.28. Knowing the
positions of these points P_1 and P_2 in the camera frame of reference as P_1^C and P_2^C, a
coordinate transform is applied to obtain their positions in the micromanipulator frame of
reference, P_1^M and P_2^M [55]. To obtain the coordinate transform, first the angle between the
two points is found for the rotation matrix, in both the camera and manipulator coordinate frames,
as β_C and β_M in (3.13 – 3.14).
β_C = tan⁻¹[(P_2,y^C − P_1,y^C) / (P_2,x^C − P_1,x^C)]    (3.13)

β_M = tan⁻¹[(P_2,y^M − P_1,y^M) / (P_2,x^M − P_1,x^M)]    (3.14)
The rotation matrix, R, is then found as follows in (3.15).

R = [cos(β_M − β_C)  −sin(β_M − β_C); sin(β_M − β_C)  cos(β_M − β_C)]    (3.15)
The transformation, T(P^C), is then as follows in (3.16) [55].

P^M = T(P^C) = sRP^C + P_offset    (3.16)

Where s is the scaling factor in μm/px, based on the camera and magnification used, and P_offset
is the offset from the micromanipulator frame origin to the camera frame origin. Once the camera
position in Cartesian space is calibrated, the micropipette moves to a third arbitrary, but known,
position to verify the micropipette calibration. This calibration process is illustrated in Figure
3.28(c).
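The two-point calibration of (3.13) through (3.16) can be sketched in Python. This is an illustrative sketch under the assumption of a uniform scale s on both axes; the function names and the numeric test values are hypothetical.

```python
import math

def make_camera_to_manip(p1c, p2c, p1m, p2m, s):
    """Build the planar transform of (3.13)-(3.16) from two tip positions
    observed in the camera frame (p1c, p2c) and the corresponding known
    positions in the micromanipulator frame (p1m, p2m); s is in um/px."""
    beta_c = math.atan2(p2c[1] - p1c[1], p2c[0] - p1c[0])   # (3.13)
    beta_m = math.atan2(p2m[1] - p1m[1], p2m[0] - p1m[0])   # (3.14)
    th = beta_m - beta_c                                    # rotation angle

    def rotate(p):                                          # R of (3.15)
        return (math.cos(th) * p[0] - math.sin(th) * p[1],
                math.sin(th) * p[0] + math.cos(th) * p[1])

    # P_offset follows from requiring T(p1c) == p1m.
    rp1 = rotate((s * p1c[0], s * p1c[1]))
    offset = (p1m[0] - rp1[0], p1m[1] - rp1[1])

    def to_manip(pc):                                       # (3.16)
        rp = rotate((s * pc[0], s * pc[1]))
        return (rp[0] + offset[0], rp[1] + offset[1])
    return to_manip
```

The angle difference β_M − β_C is recoverable from the two point pairs because the offset cancels in the point differences and a uniform scale does not change direction.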
Figure 3.28: Micropipette Calibration Procedure. (a) Micropipette at first position. (b)
Micropipette at second position. (c) Micropipette at calibration test position.
Calibration of the z-coordinate is performed in a similar, but simpler, manner. The scaling factor
used for both the camera and micromanipulator frames of reference is the same, i.e. s = 1, since
both use units of μm, and the rotation matrix reduces to R = 1 for the z-axis. Therefore, the
z-coordinate transformation is as follows in (3.17).

z_M = z_C + z_offset    (3.17)

Where z_M is the z-coordinate in the micromanipulator frame of reference, and z_C is the
z-coordinate in the camera frame of reference. z_offset is the distance between the two
coordinates and can be found by subtracting z_C from z_M. At this point, coordinates in the camera
frame of reference can be provided as inputs, and the micropipette will approach the targeted
coordinate.
3.3.2 Position Based Visual Servoing
Once calibrated, the final step is to move the micropipette to the calculated centroid of the
blastomere, C̄ = (x̄, ȳ, z̄). With the holding micropipette, the embryo is moved into the
camera's field of view. At this point, the blastomere image segmentation algorithm is performed
and provides C̄. Since the micropipette can only translate, and not rotate, a specific set of
instructions is required for the micropipette to reach the 3D coordinate, as the only way to
pierce the zona pellucida is to move the micropipette towards the embryo, with the PZD
held orthogonal to the surface of the embryo. Since the micropipette cannot rotate, the zona
pellucida piercing must occur along the x-axis.
The micropipette travels through four positions. The first position is the PZD initial position,
stored as P_pp,1 = (x_1, y_1, z_1). The target coordinate is P_pp,4 = (x_2, y_2, z_2) = C̄.
Position-based visual servoing (PBVS) is used to move through these positions. Since PBVS allows
motion outside of the focal plane of the camera, the micropipette can move in three dimensions
(x, y, z) while the focal plane of the camera remains stationary. The position coordinate P is
that of the micropipette tip, and the desired position coordinate, P*, is that of the computed BOI
centroid, C̄. The algorithm uses closed-loop control for the micromanipulators, although only one
control system cycle is run when moving the pipette from point to point; as the speed of the
visual feedback system increases in the future, higher frequency cycles can be implemented. The
error is determined between the current position and the provided target position. The
intermediate positions are provided to the software, given the blastomere centroid position, C̄.
First, the micropipette moves along the z-axis to reach the same focal plane as the blastomere
centroid. This coordinate is P_pp,2 = (x_1, y_1, z_2), the second position of the micropipette,
shown in Figure 3.29(a). The next step is for the micropipette to move along the y-axis to the
third coordinate, P_pp,3 = (x_1, y_2, z_2), shown in Figure 3.29(b). The final step is for the
micropipette to move towards the blastomere centroid, P_pp,4 = (x_2, y_2, z_2), along the x-axis,
as shown in Figure 3.29(c). This sequentially pierces the zona pellucida and then moves the
micropipette to the centroid of the blastomere. A simplified summary is given below, with the
experiment presented in Appendix A.
First translation:
P_pp,1 = (x_1, y_1, z_1) → P_pp,2 = (x_1, y_1, z̄)
Second translation:
P_pp,2 = (x_1, y_1, z̄) → P_pp,3 = (x_1, ȳ, z̄)
Third translation:
P_pp,3 = (x_1, ȳ, z̄) → P_pp,4 = (x̄, ȳ, z̄) = C̄
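The three-translation sequence above reduces to a small waypoint generator. The sketch below is illustrative (the function name and coordinates are hypothetical); it encodes the z-then-y-then-x ordering so that the final translation, along the x-axis, is the one that pierces the zona pellucida.

```python
def pbvs_waypoints(start, centroid):
    """Waypoint sequence for the PBVS approach: translate along z, then y,
    then x, given the initial tip position and the blastomere centroid."""
    x1, y1, z1 = start
    xb, yb, zb = centroid
    return [(x1, y1, z1),   # P_pp,1: initial position
            (x1, y1, zb),   # P_pp,2: match the centroid's focal plane
            (x1, yb, zb),   # P_pp,3: align along the y-axis
            (xb, yb, zb)]   # P_pp,4: the blastomere centroid, C-bar
```

Each consecutive pair of waypoints differs in exactly one coordinate, so each leg is a single-axis translation of the micromanipulator.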
Figure 3.29: Micropipette Control Path. (a) Micropipette at second position, P_pp,2. (b)
Micropipette at third position, P_pp,3. (c) Micropipette at fourth, and final, position, P_pp,4.
To summarize, Chapter 3 describes the methodology used for the blastomere segmentation and visual
servoing process. The first section of this chapter introduces z-stack images (Section 3.1) and
how they are obtained (Section 3.1.1). The next section describes blastomere image segmentation
(Section 3.2) and how it is performed as a three-step process: initializing a region of interest
(Section 3.2.1), creating a graph structure to solve for a low-cost energy path (Section 3.2.2),
and calculating the centroid of the blastomere (Section 3.2.3). Finally, the chapter concludes
with visual servoing (Section 3.3): first a micropipette is calibrated with image processing
(Section 3.3.1), and then position-based visual servoing is used to move the micropipette to a
target position (Section 3.3.2), such as the blastomere centroid.
Results and Discussion
In this chapter, the experimental procedures and results from the algorithms developed in this work
are presented. In Section 4.1, the process to obtain the centroid of the blastomere and to move a
PZD micropipette to the centroid location is described from a userโs standpoint. This is to both
describe the entire blastomere segmentation and visual servo procedure, as well as discuss in detail
the automated procedures. Section 4.2 provides results from sample experiments, how they are
obtained, and the effectiveness of the developed algorithms to accomplish the objectives. Section
4.3 discusses the results and clarifies causes of error within the algorithms.
4.1 Experimental Procedure
The entire BOI centroid computation and visual servo process involves several steps. In order for
a user to correctly perform the required task, the following steps are carried out sequentially,
as outlined in Figure 4.1. The first step is to prepare the embryo, along with the microscope.
This step
is performed manually by the user, and is illustrated in Figure 1.2. The standard inverted brightfield
microscope used for experiments is the Nikon Ti-U, equipped with two objective lenses, of 20x and
40x magnification respectively. For this setup, a 2-cell stage mouse embryo is submerged
within culture media on a petri dish and placed upon the stage of the microscope, a Prior Proscan
III, with translation actuation along the xy-axes. Two robotic micromanipulators, Scientific
Patchstar Micromanipulators, are placed on either side of the stage and are each equipped with a
micropipette. These micromanipulators have translation actuation along the xyz-axes, with the xy-
axes aligned with those of the motorized stage and camera frame. The left-hand micromanipulator
is equipped with the holding micropipette, with an outer diameter of 120 ฮผm and an inner diameter
of 15-20 ฮผm, capable of holding onto an early-stage mouse embryo of approximately 100 ฮผm
diameter. The right-hand micromanipulator is equipped with a solid sharp tip PZD micropipette,
capable of moving to a targeted position and piercing a targeted blastomere. The range of the PZD
micropipette is limited to the range of the micromanipulator, at 5 cm along the x, y, and z
directions. However, since the micropipette cannot move through the stage without structural
failure, care must be taken so that the micropipette moves only within approximately 2.5
cm above the stage. A standard QImaging optiMOS camera is attached to the microscope via the
objective lens. All microscope components, including the stage, micromanipulators, and camera,
are connected to a desktop computer equipped with an Intel Core i7-4790 CPU at 3.6 GHz and 16
GB of RAM. Each component has corresponding software associated with it; the stage and
objective lens run on Prior Scientific Controller, the micromanipulators run on Scientific Patchstar
LinLab 2, and the camera runs on Micromanager 1.4 and ImageJ. The developed algorithms are
programmed, and executed, in Matlab R2018a. A Sutter Instrument XenoWorks Digital
Microinjector serves as a vacuum source for the holding pipette. To prepare for determination of the
blastomere centroid, the user moves the holding micropipette towards the embryo and immobilizes
it by applying a pressure of −5 hPa. The PZD micropipette is brought into camera view, after which
the holding pipette, along with the embryo, moves out of the camera view in order to proceed to
micropipette calibration.
The user then opens the developed micropipette calibration program on Matlab. The only
requirement from the user is to input the scaling factor, s, depending on the objective lens used for
the experiment. For the 20x and 40x lenses, the scaling factor is 0.32244 μm/px and 0.16043
μm/px, respectively. Then the PZD micropipette calibration is performed automatically,
allowing for the subsequent programs to move the PZD micropipette to desired positions with only
camera coordinates, rather than micromanipulator coordinates.
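The role of the scaling factor can be sketched as follows (in Python for illustration; the thesis implementation is in Matlab). It converts pixel displacements in the camera frame to microns, using the objective-dependent values quoted above:

```python
# Minimal sketch (not the thesis's Matlab code): converting a pixel
# displacement in the camera frame to microns using the scaling factor s,
# which depends on the objective lens in use.
SCALE_UM_PER_PX = {20: 0.32244, 40: 0.16043}  # s for the 20x and 40x objectives

def px_to_um(delta_px, magnification):
    """Convert a pixel displacement to microns for the given objective lens."""
    return delta_px * SCALE_UM_PER_PX[magnification]

# A 100 px displacement under the 20x objective is about 32.2 um.
displacement_um = px_to_um(100, 20)
```

Note that the 40x objective halves the field of view per pixel, so the same pixel displacement corresponds to roughly half the physical distance.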
To acquire the image stack of z-stack images, Micromanager is used to capture images and save
them as TIF files, and Prior Scientific Controller is used for moving the objective lens along the
z-axis. Starting from below the bottom of the embryo, a total of 60 z-stack images are successively
captured over 120 μm, centered at the middle of the embryo, to ensure the entire embryo is captured
within the image stack data. The total time taken to capture these images is 7.5 s, at 0.125 s per
z-stack image with 3 ms of exposure. These parameters require user input and are set manually.
However, once the user initiates the image acquisition program, the automated
program begins, moving the focal plane, acquiring the images and storing them as an image stack
in TIF files. Once the z-stack images are acquired, Micromanager converts the file into an easily
accessible TIF file, ready for Matlab to use. Currently, there does not exist a software bridge
between Matlab and Micromanager, and hence image stacks acquired from Micromanager cannot
automatically be read through Matlab. Therefore, once saved onto the computer, the image stacks
are manually imported into Matlab, which then proceeds with the automated BOI centroid
calculation process.
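The acquisition plan above can be summarized in a short sketch (Python for illustration; the actual acquisition runs through Micromanager and Prior Scientific Controller). With 60 images over 120 μm, consecutive focal planes are 2 μm apart, and at 0.125 s per image the stack takes 7.5 s:

```python
# Sketch of the z-stack acquisition plan described above. z_center_um is the
# z position of the middle of the embryo; acquisition starts below the embryo.
def zstack_plan(n_images=60, span_um=120.0, period_s=0.125, z_center_um=0.0):
    spacing = span_um / n_images              # 2 um between consecutive z-stack images
    z_start = z_center_um - span_um / 2.0     # begin below the bottom of the embryo
    z_positions = [z_start + k * spacing for k in range(n_images)]
    total_time_s = n_images * period_s        # 7.5 s for the full stack
    return z_positions, total_time_s

z_positions, total_time_s = zstack_plan()     # 60 focal planes, 7.5 s
```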
Once the TIF file of the z-stack image data is read by Matlab, the user sets parameters indicating
which blastomere is the blastomere of interest. Then, the program undergoes the automated
blastomere segmentation process, involving generating a region of interest, creating a low-cost
energy path, and computing the centroid coordinate, Ĉ, of the blastomere of interest, as described
in Section 3.2. The time to compute the centroid coordinate of the BOI is approximately 45
minutes, with current computing power. This computed centroid is referenced to the camera
frame, and is then transformed into the micromanipulator frame with the use of another Matlab
program, so that the micromanipulator can translate to its targeted position when provided
coordinates within the camera frame. There is a software connection between Matlab and the
Scientific Patchstar micromanipulators, and Matlab is capable of controlling the
micromanipulators by sending position coordinates within the micromanipulator frame. The
micropipette then translates to the calculated blastomere centroid in a sequential pattern under PBVS,
as detailed in Section 3.3.2, and the automated task is successfully completed. The foregoing
procedures, from Matlab importing the z-stack image data to moving the micropipette to a targeted
position, are entirely automated and require no user input.
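As an illustration of the camera-to-micromanipulator transformation step (the actual transformation is established by the calibration of Section 3.3; the per-axis scale and offset values below are hypothetical, not taken from this work):

```python
# Illustrative sketch only: assume calibration yields a per-axis scale and
# offset mapping camera coordinates (x px, y px, z um) into the
# micromanipulator frame (um). The thesis's calibration is more involved;
# this shows only the form of the mapping.
def camera_to_manipulator(p_cam, scale, offset):
    # p_robot[i] = scale[i] * p_cam[i] + offset[i] for each axis
    return tuple(s * c + o for s, c, o in zip(scale, p_cam, offset))

# Hypothetical calibration result for the 20x objective (invented numbers):
scale = (0.32244, 0.32244, 1.0)   # um/px along x and y; z already in um
offset = (150.0, -75.0, 0.0)
p_robot = camera_to_manipulator((950, 725, 28), scale, offset)
```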
Figure 4.1: Flowchart of overall BOI centroid computation and visual servo process. The boxes
represent tasks, whereas the arrows represent the procession from one task to another. Orange
boxes and arrows represent tasks performed manually, whereas the blue boxes and arrows
represent automatically performed tasks.
4.2 Experimental Results
To validate the proposed automated procedures, automated blastomere segmentation and visual
servoing were performed on a 2-cell mouse embryo as a proof of concept. The centroid calculations
are verified against results acquired manually, whereas the visual servo is verified experimentally.
A sample experiment is performed to compute the BOI centroid, as described in Section 4.1. Once
the BOI segmentation and centroid computing algorithms of Section 3.2 are performed, they output
a calculated BOI centroid of Ĉ = (x̂, ŷ, ẑ). The centroid resulting from the algorithms is verified
against manual segmentation of the blastomeres, where the centroid determined by manual
segmentation is labeled as Cm = (Cmx, Cmy, Cmz). The manual segmentation is performed by user
observation. The z-stack images from the image stack are obtained sequentially and the xy-
centroid coordinates are found by observation. The sample data obtained can be seen in Table 1 in
Appendix B. The errors along the x and y axes are ex = |Cmx − x̂| and ey = |Cmy − ŷ|,
respectively. On average, ex is found to be approximately 5.751 pixels (1.854 μm or 9.27%), and
ey is found to be approximately 1.939 pixels (0.625 μm or 3.125%), as calculated in Tables B.2
and B.3 in Appendix B. The error along the z-axis is ez = |Cmz − ẑ|, where Cmz is the z-coordinate
used for the z-stack image at the middle of the image stack, also used to find the centroid
approximation for the ROI in Section 3.2.1. ez is found to be approximately 11.774 μm
(approximately 6 z-stack images or 58.89%).
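This error computation can be sketched as follows (Python for illustration): x and y are compared in pixels and converted to microns with the 20x scaling factor, while z is compared directly in microns. The sample coordinates below are illustrative placeholders, not values from the tables:

```python
# Sketch of the per-axis verification errors between the manually determined
# centroid Cm and the computed centroid C_hat. x and y errors are in pixels,
# converted to microns with the 20x scaling factor s = 0.32244 um/px.
def centroid_errors(c_manual, c_computed, s_um_per_px=0.32244):
    ex_px = abs(c_manual[0] - c_computed[0])
    ey_px = abs(c_manual[1] - c_computed[1])
    ez_um = abs(c_manual[2] - c_computed[2])   # z is already in microns
    return ex_px * s_um_per_px, ey_px * s_um_per_px, ez_um

# Illustrative input values only; e.g. an x error of 5.751 px scales to ~1.854 um.
ex_um, ey_um, ez_um = centroid_errors((426.0, 1175.0, 37.0), (420.0, 1177.0, 25.0))
```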
Although this verification method is prone to human error, a more rigorous approach is to manually
outline the boundary of the blastomere at every z-stack image of the image stack via drawing
software, and then compute the resulting xy-coordinates of the centroid. The area could then be
computed for every z-stack image and an accurate z-coordinate determined, providing a more
accurate reference model for determining error and validating the algorithms.
However, the accuracy required of blastomere related tasks, such as blastomere aspiration, depends
on the dimensions of the aspirating micropipette. An aspirating micropipette must have an outer
diameter less than that of the object that it is aspirating, in order to correctly aspirate via suction.
The typical inner diameter of an aspirating micropipette is between 30 and 40 μm, and the errors
between the observed Cm and the calculated Ĉ are within these required tolerances.
A sample experiment is performed to move the PZD micropipette to a given targeted position.
The PBVS algorithms, as described in Section 3.3, are used to move the micropipette tip to the
targeted position. The PBVS algorithms are under closed-loop control, and allow the
micromanipulator to move the micropipette to a targeted 3D Cartesian coordinate, rather than
camera coordinates. The algorithm converts the targeted position from the camera frame of
reference to the robot frame of reference for the micromanipulator to move. Once the PBVS
algorithms of Section 3.3 are performed, the micropipette moves to the given position. The visual
servo algorithms are verified based on how close the micropipette tip is to this given coordinate.
After calibration and coordinate frame transformation from the camera frame to the
micromanipulator frame, coordinates in the camera frame can be given to the micromanipulator,
and the micropipette will move to the respective coordinate in micromanipulator frame. A sample
target coordinate is given as G = (xG, yG, zG) = (950, 725, 28). The PZD micropipette then
moves from its initial position, P1 = (1275, 550, 31), to the second intermediate position, P2 =
(1275, 550, 28), to the third, P3 = (1275, 725, 28), and to the final targeted position of the BOI
centroid, Ĉ = P4 = (950, 725, 28), successively, following the steps described in Section
3.3.1, and are found in Table 4 in Appendix C. A figure for this example process is shown in Figure
4.2.
The error is found by obtaining the exact coordinates of the micropipette tip within the camera
frame, to determine the accuracy of the micropipette within a visual servoing environment. The
true micropipette tip positions for the experiment are found in Table 5, as T = (xT, yT, zT). The
error for each intermediate position is found as E = (xE, yE, zE), where xE = |xT − xG|, yE =
|yT − yG|, and zE = |zT − zG|, as shown in Table 6. The errors along the x and y axes are found
to be approximately within 25 pixels. These errors are largely due to possible calibration error
along the x-axis. Overall, the errors are observed to be within 8 μm (40%). Since the
blastomeres are approximately 50 μm in diameter and the aspirating micropipette is approximately
30-40 μm, these errors are within the required tolerances for tasks such as blastomere
aspiration.
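The sequential motion in the sample experiment changes one axis per step: z first, then y, then x. A minimal sketch of that waypoint pattern, assuming this ordering generalizes (the exact sequencing rule is that of Section 3.3.1):

```python
# Sketch of the sequential PBVS motion pattern from the sample experiment:
# the micropipette changes one coordinate at a time, z first, then y, then x.
def pbvs_waypoints(start, goal):
    x0, y0, _z0 = start
    xg, yg, zg = goal
    return [start, (x0, y0, zg), (x0, yg, zg), (xg, yg, zg)]

waypoints = pbvs_waypoints((1275, 550, 31), (950, 725, 28))
# -> [(1275, 550, 31), (1275, 550, 28), (1275, 725, 28), (950, 725, 28)]
```

The returned list reproduces the intermediate positions P1 through P4 of the sample experiment.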
Figure 4.2: Sample Experiment of Visual Servoing.
4.3 Discussion of Results
Given the results, it is important to discuss possible cases of failure of both the blastomere
segmentation and visual servoing algorithms.
Image acquisition and processing parameters greatly influence the blastomere segmentation
process, specifically in finding the low-cost energy path. Image acquisition parameters such as the
objective lens used, camera exposure time and binning, the distance between consecutive z-stack
images, dI, the time taken to acquire the z-stack images of an image stack, and the size of the image
stack, N, all play a role in the results. These parameters may require changes to other
parameters in image processing, such as the size of the standard deviation filter, the threshold for
binarizing an image, the size of the structuring element, and the sizes of the ROI radii and graph structures.
Experiments and their parameters are to be kept consistent to achieve consistent results; changes
between experiments require these parameters to be retuned until successful results are achieved.
An issue relevant to image processing arises due to the translucency of the embryo. Although the
zona pellucida and perivitelline space of the embryo are transparent, blastomeres are slightly
opaque. Hence, as can be seen in Figure 4.3, only one side of the blastomere is visible to the
microscope. In the case of an inverted brightfield microscope, the boundaries of the bottom half of
the blastomeres are visible, observed as white circular rings. The upper half of the blastomere
is not visible to the microscope (Figure 4.3). Therefore, to compute the centroid of the BOI, only
data obtained that provides a clear view of the boundaries of the blastomere are processed. In the
case of the sample experiment, only z-stack images between I3 and I36 are visible and processed.
Determining the size of the region of interest raises another issue. The ROI must encompass the
boundary of the blastomere at every z-stack image. The number of pixels in the ROI significantly
impacts image processing time. The main components that affect the structure of the ROI
are the radii, ρ′ and ρ′′, and the number of θ samples, nθ. The larger the number of points of nρ and
nθ used for the image array J, the greater the computation time. Note that only the range between
ρ′ and ρ′′ changes the number of sampling points, nρ, and not the physical values of the radii. For
the ROI to account for the changing radii of the spherically shaped blastomere at every z-stack
image, the radii, ρ′ and ρ′′, change according to the blastomere boundary radii; however, nρ is kept
consistent. Uncertainty arises when the blastomere lies within an unknown region of the embryo
in 3D space; the ROI must then be larger to accommodate such uncertainty.
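To make the computational cost concrete, the annular sampling that forms the image array J can be sketched as follows (illustrative Python; function and parameter names are not from the thesis). The number of samples, and hence the processing time, grows with nθ · nρ:

```python
import math

# Illustrative sketch of the polar ROI sampling: n_theta angles and n_rho radii
# between the inner radius rho1 (rho') and outer radius rho2 (rho''), centered
# on the approximate centroid. Pixel intensity lookup is omitted; only the
# Cartesian sample coordinates are generated.
def polar_roi_samples(center, rho1, rho2, n_rho, n_theta):
    cx, cy = center
    samples = []
    for i in range(n_theta):
        theta = 2.0 * math.pi * i / n_theta
        for j in range(n_rho):
            rho = rho1 + (rho2 - rho1) * j / (n_rho - 1)
            samples.append((cx + rho * math.cos(theta), cy + rho * math.sin(theta)))
    return samples  # n_theta * n_rho points

# Hypothetical sizes: 360 angles x 41 radii -> 14,760 samples per z-stack image.
pts = polar_roi_samples(center=(425.5, 1174.9), rho1=20.0, rho2=60.0, n_rho=41, n_theta=360)
```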
Figure 4.3: Various z-stack images of the embryo. (a) z-stack image at I14. Note the white circularly
shaped outline within the embryo. This is the boundary of the blastomere at this z-stack image. (b)
z-stack image at I31. Also used as the middle of the image stack, as it is the z-stack image with
the largest blastomere boundary. (c) z-stack image at I44. Blastomere boundary is not visible due
to blastomere opacity.
One issue that arises with the automated micropipette calibration procedure is due to the solid
sharp tip of the PZD micropipette. Two long sides of the micropipette join to form a sharp tip,
instead of a flat tip like that of a holding micropipette (Figure 4.4). The image processing
procedure detects the tip based on where the drawn line, as detailed in Section 3.3.1,
encounters the micropipette side wall. The angle of the line, α, greatly affects where this
intersection occurs. With a sharp tip, such as that of a PZD micropipette, the likelihood of initially
acquiring an image of the tip is low. For this reason, the micropipette calibration will contain error, though the error is
insufficient to lead to visual servo failure for blastomere aspiration. This is one of the causes of
error within the visual servo algorithms.
Figure 4.4: Comparison of Micropipette Tips for Calibration.
With these issues addressed during experiments, the blastomere segmentation and visual
servo algorithms can successfully perform the required tasks.
Conclusions
5.1 Summary and Conclusions
The development of algorithms for automated single cell surgery is crucial for IVF and PGD
tasks. In this work, a detailed literature review is first presented, focusing on the various types and
limitations of microscopy, depth of field, and various image processing and cell segmentation
techniques. The literature review also overviews two visual servo approaches, including image-
based, and position-based visual servoing. Obtaining a blastomere from an early-stage embryo is
vital for processes such as PGD genetic testing. The centroid position of a blastomere of interest
in 3D Cartesian space must be known in order to easily perform position-based visual
servoing procedures. Algorithms are developed to acquire z-stack images via an inverted
brightfield microscope, compute centroid coordinates of a BOI within an embryo in 3D Cartesian
space, and move a micropipette to a computed position using position-based visual servo control,
which are separated into three sections, as shown in the methodology.
First, a method for automating the collection of embryo image data is performed. The main goal
for this section is to acquire image data of the embryo, along with the BOI, to be used in subsequent
sections. Starting from below the bottom of the embryo and moving the focal plane along the z-
direction to above the embryo, a sequence of images is captured successively at equally spaced
focal planes, known as z-stack images. The collection of z-stack images is known as the image
stack. The image stack is saved and exported as a TIF file for further analysis and processing.
Acquisition of the image stack is obtained with an inverted brightfield microscope.
Given the series of z-stack images, the main goal for this section is to compute the centroid of the
BOI in 3D Cartesian space, for PGD genetic testing purposes. The image stack of z-stack images
is then exported to Matlab, where it undergoes a series of image processing steps to compute the
centroid of the BOI, with a three-step process. The first step is to obtain the z-stack image from
the middle of the image stack, and compute an approximate centroid for the region of interest used
in the next step. This follows a series of image processing steps, involving a standard deviation filter,
binarization thresholds, area filters, and structuring elements, creating a blob of the two blastomeres.
With image processing, the blastomeres in the blob are separated, leaving the two blastomeres,
where the approximate centroids are then computed. Only the BOI is taken for further examination.
An ROI is then developed, centered at this approximate centroid for the subsequent steps. The
second step is to segment the blastomere at this z-stack image. The ROI is converted into polar
coordinates and assigned energy values. A graph structure is then constructed, incorporating the
z-stack images both below and above the current z-stack image, where then a low-cost energy path
is generated in order to segment the blastomere at the respective z-stack image, which then is
converted back into Cartesian coordinates. The third step begins with carrying through Steps 1 and
2 for all z-stack images, and subsequently, with the use of weighted averaging equations,
computes the centroid of the BOI in 3D Cartesian space.
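The weighted averaging step for the z-coordinate is consistent with Table 2 of Appendix B, where ẑ = Σ(zk·Ak)/Σ(Ak) (11854800 / 472055 ≈ 25.11). A minimal sketch:

```python
# Sketch of the area-weighted average used for the z-coordinate of the BOI
# centroid: each z-stack plane z_k is weighted by the segmented blastomere
# area A_k at that plane, so larger cross-sections pull the centroid toward
# their plane.
def weighted_z(z_values, areas):
    return sum(z * a for z, a in zip(z_values, areas)) / sum(areas)

z_hat = weighted_z([24.0, 26.0, 28.0], [100.0, 300.0, 100.0])  # -> 26.0
```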
Given the computed BOI centroid coordinates in 3D Cartesian space, the main goal for this section
is to move a micropipette to the computed position with PBVS, as a proof of concept. First, the
micropipette is calibrated in order to accurately move to a target position. The second step is to
move the micropipette to the computed BOI centroid coordinates in 3D Cartesian space. This is
performed through visual servo, and by the micropipette moving through a series of intermediate
steps to reach the computed BOI coordinates. Specifically, PBVS is used because it is a technique
that controls a robotic micromanipulator with feedback from visual images based on 3D positional
coordinates, instead of 2D camera coordinates. This is useful due to the shallow depth of field
found in microscope images at this scale.
The results of this methodology provide a guide from a user's standpoint, as well as sample
experimental results from a 2-cell stage embryo. The algorithms are found to provide accurate
results for computing the blastomere centroid, and for micropipette visual servoing. The accuracy
of the results is sufficient for cell surgery related tasks, such as blastomere aspiration for PGD.
In conclusion, all the objectives in this research are achieved. The developed algorithms compute
and determine the centroid coordinates of a BOI within an embryo in 3D Cartesian space, and
move a micropipette to a computed position using PBVS control. The algorithms also integrate
with the existing inverted brightfield microscope setup, equipped with two robotic
micromanipulators and a motorized stage.
5.2 Contributions
In this work, the following contributions are made:
1. The research proposes a method to obtain image data from across the z-direction of an
embryo, known as z-stack images.
2. From obtained z-stack images, the research proposes an automated image processing
procedure to determine the centroid of a BOI, and a visual servo procedure to move a
micropipette to this computed centroid position.
3. The proposed research integrates with the existing Nikon Ti-U brightfield microscope
setup, equipped with two robotic micromanipulators and a motorized stage.
5.3 Recommendations and Future Works
Based on the research accomplished in this thesis, future works are suggested in the following
aspects:
1. Explore image processing algorithms with images from high contrast imaging, such as
fluorescent imaging.
2. Program the algorithms in a faster language than Matlab, such as C/C++ or Python, as
well as develop algorithms that incorporate parallel programming for faster computation
in generating the graph structures.
3. Develop higher frequency closed-loop visual servo control algorithms for micropipette
movement, rather than single cycle closed-loop motion.
4. Reconstruct blastomere shapes along with their positions using the blastomere
segmentations along the z-stack images.
5. Investigate machine learning segmentation techniques, such as semantic segmentation, for
blastomere segmentation, and improve on the accuracy of the proposed algorithms.
Appendices
Appendix A. Experiment of Visual Servoing
Appendix B. Sample Blastomere Coordinate Calculations
๐ฐ๐ ๐๐๐ (in px) ๐๐๐ (in px) ๐จ๐ (in px2) ๐๐๐ (in px) ๐๐๐ (in px)
3 407.0214 1180.426 2050 426 1166
4 419.1576 1177.493 1487 428 1166
5 418.5451 1172.339 2566 430 1168
6 425.0089 1173.815 1949 428 1168
7 426.2316 1176.548 2186 428 1168
8 426.1681 1179.861 3018 426 1168
9 423.563 1177.703 4334 426 1172
10 419.9018 1178.01 4334 424 1172
11 423.2451 1180.032 4996 422 1170
12 425.0008 1180.104 5141 426 1174
13 421.8036 1181.056 5870 424 1174
14 423.9012 1185.474 6242 428 1174
15 422.8728 1178.996 7930 426 1176
16 425.4549 1174.813 9187 432 1174
17 413.3434 1159.006 9741 428 1174
18 417.7459 1179.887 10241 430 1178
19 427.872 1163.592 12417 426 1176
20 395.5784 1166.374 24160 436 1182
21 384.4635 1162.628 21686 420 1178
22 397.111 1162.045 20582 418 1172
23 423.3193 1186.507 24915 422 1176
24 424.1436 1186.029 24029 420 1180
25 423.7309 1184.987 22637 422 1178
26 420.3445 1188.147 22637 426 1170
27 422.7234 1170.544 19932 424 1174
28 421.6773 1185.161 21961 426 1176
29 421.9259 1185.353 21827 426 1176
30 421.4112 1182.425 21882 420 1178
31 421.7194 1181.189 21909 422 1182
32 423.0068 1183.188 21788 422 1182
33 423.396 1183.624 21988 426 1182
34 427.884 1174.486 22076 426 1182
35 428.7528 1174.456 23348 426 1182
36 424.4434 1157.622 21009 428 1180
Table 1: Sample Blastomere Coordinate Data
x̄_c (in px)  ȳ_c (in px)  Σ A_i, i=1..N (in px²)  Σ (z_i · A_i), i=1..N (in px³)  z̄ (in μm)  x̄_T (in px)  ȳ_T (in px)
419.7785  1176.88  472055  11854800  25.1131  425.5294  1174.941
Table 2: Sample Blastomere Coordinate Calculations
     (in px)  (in μm)
e_x  5.75093  1.85434 (9.27%)
e_y  1.938776  0.625141 (3.125%)
e_z  5.887  11.774 (58.89%)
Table 3: Sample Blastomere Coordinate Calculation Errors
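The sums in Table 2 indicate that the 3D blastomere centroid is computed as an area-weighted average of the per-slice 2D centroids, i.e. x̄ = Σ(x_i·A_i)/ΣA_i and likewise for ȳ and z̄ (the table's z̄ = 11854800 / 472055 = 25.1131 is consistent with this). A minimal sketch of that calculation, with made-up per-slice values rather than the full data of Table 1:

```python
# Area-weighted 3D centroid from per-slice 2D segmentation results.
# Assumed inputs: for each z-stack image, the 2D blastomere centroid
# (x_i, y_i) in pixels and the segmented blastomere area A_i in px^2,
# combined as in Table 2: x_bar = sum(x_i * A_i) / sum(A_i), etc.

def weighted_centroid(slices):
    """slices: list of (z_i, x_i, y_i, A_i) tuples; returns (x_bar, y_bar, z_bar)."""
    total_area = sum(a for _, _, _, a in slices)
    x_bar = sum(x * a for _, x, _, a in slices) / total_area
    y_bar = sum(y * a for _, _, y, a in slices) / total_area
    z_bar = sum(z * a for z, _, _, a in slices) / total_area
    return x_bar, y_bar, z_bar

# Illustrative three-slice example (values invented for the sketch):
demo = [(1, 100.0, 200.0, 10.0),
        (2, 102.0, 202.0, 20.0),
        (3, 104.0, 204.0, 10.0)]
print(weighted_centroid(demo))  # -> (102.0, 202.0, 2.0)
```

Weighting by segmented area makes slices near the blastomere's equator, where the cross-section is largest, dominate the estimate, which is why Table 1's larger-area slices pull x̄ and ȳ toward their values.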
Appendix C. Sample Micropipette Tip Coordinate Calculations
     x_G (in px)  y_G (in px)  z_G (in μm)
P_1  1275  550  31
P_2  1275  550  28
P_3  1275  725  28
P_4  950  725  28
Table 4: Sample Given Micropipette Tip Coordinates
     x_T (in px)  y_T (in px)  z_T (in μm)
P_1  1255  550  31
P_2  1255  550  28
P_3  1254  724  28
P_4  925  726  28
Table 5: Sample True Micropipette Tip Coordinates
     x_E (in px)  y_E (in px)  z_E (in μm)
P_1  20 (32.24%)  0 (0%)  0
P_2  20 (32.24%)  0 (0%)  0
P_3  21 (33.86%)  1 (1.61%)  0
P_4  25 (40.31%)  1 (1.61%)  0
Table 6: Sample Micropipette Tip Coordinate Errors
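The entries of Table 6 are consistent with the absolute error e = |given − true| in pixels, converted to micrometres with a scale of roughly 0.3224 μm/px (inferred from Table 3, where 5.75093 px corresponds to 1.85434 μm), and the percentage expressed relative to a 20 μm reference length. Both the scale and the 20 μm reference are inferences from the tables, not values stated in this appendix. A sketch under those assumptions:

```python
# Pixel error between given (commanded) and true micropipette tip
# coordinates, converted to micrometres and to a percentage of an
# assumed 20 um reference length. The um/px scale and the reference
# length are inferred from Tables 3 and 6, not stated explicitly.

UM_PER_PX = 1.85434 / 5.75093   # ~0.3224 um/px, from Table 3's e_x row
REF_UM = 20.0                   # assumed 20 um reference length

def tip_error(given_px, true_px):
    """Return (error_px, error_um, error_pct) for one coordinate."""
    e_px = abs(given_px - true_px)
    e_um = e_px * UM_PER_PX
    return e_px, e_um, 100.0 * e_um / REF_UM

# P_4 from Tables 4 and 5: x_G = 950 px, x_T = 925 px
e_px, e_um, e_pct = tip_error(950, 925)
print(e_px, round(e_pct, 2))  # -> 25 40.31
```

Applying the same function to P_1 (x_G = 1275 px, x_T = 1255 px) reproduces the 20 px (32.24%) entry, which supports the inferred scale and reference.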
Appendix D. Sample of Blastomere Segmentation Across Image Stack
[Figures: blastomere segmentation results overlaid on z-stack images 3 through 36, one panel per image.]