Research Papers

Computational Investigation of Drug Action on Human-Induced Stem Cell-Derived Cardiomyocytes

Author and Article Information
Ralf Frotscher

Biomechanics Laboratory,
Institute for Bioengineering,
Aachen University of Applied Sciences,
Jülich 52428, Germany
e-mail: frotscher@fh-aachen.de

Jan-Peter Koch

Biomechanics Laboratory,
Institute for Bioengineering,
Aachen University of Applied Sciences,
Jülich 52428, Germany

Manfred Staat

Professor
Biomechanics Laboratory,
Institute for Bioengineering,
Aachen University of Applied Sciences,
Jülich 52428, Germany
e-mail: m.staat@fh-aachen.de

Corresponding author.

Manuscript received November 27, 2014; final manuscript received March 13, 2015; published online June 2, 2015. Assoc. Editor: Pasquale Vena.

J Biomech Eng 137(7), 071002 (Jul 01, 2015) (7 pages); Paper No: BIO-14-1593; doi: 10.1115/1.4030173

We compare experimental and computational results for the actions of the cardioactive drugs Lidocaine, Verapamil, Veratridine, and Bay K 8644 on a tissue monolayer consisting mainly of fibroblasts and human-induced pluripotent stem cell-derived cardiomyocytes (hiPSc-CM). The choice of the computational models is justified, and literature data are collected to model drug action as accurately as possible. The focus of this work is to evaluate how well existing models for native human cells can simulate pharmaceutical treatment of monolayers and hiPSc-CM. From the comparison of experimental and computational results, we derive suggestions for model improvements that are intended to computationally support the interpretation of experimental results obtained for hiPSc-CM.
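As a concrete illustration of how drug action is commonly introduced into such electrophysiological models, the sketch below applies the standard simple pore-block (Hill-type) scaling to a maximal channel conductance. This is a minimal sketch of the general technique only; the function name, conductance, and IC50 values are illustrative placeholders, not the parameters used in this paper.

def conductance_block(g_control, drug_conc, ic50, hill=1.0):
    """Maximal conductance scaled by a simple pore-block (Hill) factor.

    g_control -- drug-free maximal conductance (model units)
    drug_conc -- drug concentration (same units as ic50)
    ic50      -- half-blocking concentration (illustrative placeholder)
    hill      -- Hill coefficient (1.0 = simple pore block)
    """
    return g_control / (1.0 + (drug_conc / ic50) ** hill)

# Example: fraction of a channel conductance remaining at several
# concentrations, expressed in multiples of the (placeholder) IC50.
for ratio in (0.1, 1.0, 10.0):
    remaining = conductance_block(1.0, drug_conc=ratio, ic50=1.0)
    print(f"[D]/IC50 = {ratio:>4}: {remaining:.1%} of control conductance")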

Copyright © 2015 by ASME


Figures

Fig. 1

Bulge test in the CellDrum™: (a) pressure–deflection curves, (b) change in deflection during individual beats, and (c) schematic drawing (figure composed from Ref. [3])
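A common first-pass reduction of bulge-test data such as the pressure–deflection curves in Fig. 1(a) assumes the inflated monolayer deforms into a spherical cap; the actual CellDrum analysis in Ref. [3] is more elaborate. The sketch below uses that textbook assumption, and the numbers are illustrative, not measured values from the paper.

def spherical_cap_tension(pressure, radius, deflection):
    """Curvature and membrane tension from bulge-test data.

    Assumes the inflated membrane forms a spherical cap:
      R = (a**2 + w**2) / (2 * w)   (radius of curvature)
      T = p * R / 2                 (Laplace's law, equibiaxial tension)
    pressure   -- transmembrane pressure p in Pa
    radius     -- clamped membrane radius a in m
    deflection -- centre deflection w in m
    """
    R = (radius**2 + deflection**2) / (2.0 * deflection)
    return R, pressure * R / 2.0

# Illustrative numbers only (not data from the paper):
R, T = spherical_cap_tension(pressure=50.0, radius=8e-3, deflection=1e-3)
print(f"R = {R * 1e3:.1f} mm, T = {T:.3f} N/m")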

Fig. 2

Inotropic effect of Lidocaine in experiment (dashed line), simulation paced at 1 Hz (continuous, square markers), and paced at measured frequencies (continuous, triangular markers)

Fig. 3

Inotropic effect of Verapamil in experiment (dashed line), simulation using the models TT–NHS (continuous, square markers), and MNT–HMT (continuous, circular markers)

Fig. 4

Inotropic effect of Veratridine in experiment (dashed line), simulation paced at 1 Hz (continuous, square markers), and paced at measured frequencies (continuous, triangular markers)

Fig. 5

Inotropic effect of Bay K 8644 in experiment (dashed line), simulation paced at 1 Hz (continuous, square markers), and paced at measured frequencies (continuous, triangular markers)

Fig. 6

Chronotropic effect of Lidocaine in experiment (dashed line), MNT simulation (continuous, circular markers), Chandler simulation (continuous, diamond markers), and Seemann simulation (continuous, rectangular markers)

Fig. 7

Chronotropic effect of Veratridine in experiment (dashed line), MNT simulation (continuous, circular markers), Chandler simulation (continuous, diamond markers), and Seemann simulation (continuous, rectangular markers)

Fig. 8

Chronotropic effect of Bay K 8644 in experiment, Chandler simulation (continuous, diamond markers), and Seemann simulation (continuous, rectangular markers)
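Quantifying the chronotropic effects compared in Figs. 6–8 requires extracting a beating frequency from measured or simulated deflection traces, as in Fig. 1(b). The following is a minimal peak-detection sketch, not the processing pipeline used in the paper; the demonstration signal and its parameters are synthetic.

import numpy as np
from scipy.signal import find_peaks

def beat_frequency(trace, dt, min_prominence=0.1):
    """Mean beating frequency in Hz from a sampled deflection trace.

    trace -- 1D array of deflection samples
    dt    -- sampling interval in s
    Peaks below the prominence threshold (a fraction of the
    signal range) are ignored as noise.
    """
    peaks, _ = find_peaks(trace, prominence=min_prominence * np.ptp(trace))
    if len(peaks) < 2:
        return float("nan")
    return 1.0 / (np.mean(np.diff(peaks)) * dt)

# Synthetic demonstration trace: ~1.2 Hz beating plus noise (illustrative only)
dt = 1e-3
t = np.arange(0.0, 10.0, dt)
trace = np.maximum(np.sin(2 * np.pi * 1.2 * t), 0.0) ** 4
trace += 0.02 * np.random.default_rng(0).standard_normal(t.size)
print(f"estimated beating frequency: {beat_frequency(trace, dt):.2f} Hz")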
