Significance: Current medical imaging systems have significant limitations for cardiovascular applications, which new optical technologies may overcome. Particularly interesting are technologies for diagnosing cardiac diseases such as fibrosis, myocarditis, and transplant rejection.
Aim: To introduce and assess a new optical system that characterizes cardiac muscle tissue using light-scattering spectroscopy (LSS) in conjunction with machine learning.
Approach: We used an ovine model to investigate whether the new LSS system can estimate the density of cell nuclei in cardiac tissue. We measured nuclear density using fluorescent labeling, confocal microscopy, and image processing. Spectra acquired from the same cardiac tissues were analyzed with spectral clustering and convolutional neural networks to assess the feasibility and reliability of density quantification.
Results: Spectral clustering revealed distinct groups of spectra correlated with ranges of nuclear density. Convolutional neural networks correctly classified three groups of spectra with low, medium, or high nuclear density with 95.00±11.77% (mean and standard deviation) accuracy. The analysis also revealed that accuracy is sensitive to the wavelength range and to subsampling of the spectra.
Conclusions: LSS combined with machine learning can assess nuclear density in cardiac tissues. The approach could be useful for diagnosing cardiac diseases associated with an increased density of cell nuclei.
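As a rough illustration of the grouping idea only (not the authors' pipeline, which clusters full LSS spectra and classifies them with convolutional neural networks), the sketch below separates an invented one-dimensional spectral feature into low-, medium-, and high-density groups with a plain k-means; all data and parameters are assumptions.

```python
import random
import statistics

random.seed(0)

# Invented stand-in data: each "spectrum" is reduced to a single scalar
# feature (e.g. a mean scattering intensity); real LSS spectra are vectors.
low    = [random.gauss(0.2, 0.03) for _ in range(20)]  # low nuclear density
medium = [random.gauss(0.5, 0.03) for _ in range(20)]
high   = [random.gauss(0.8, 0.03) for _ in range(20)]
features = low + medium + high

def kmeans_1d(xs, iters=20):
    """Plain 1-D k-means with 3 centroids spread across the data range."""
    centroids = [min(xs), statistics.mean(xs), max(xs)]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in xs:
            i = min(range(len(centroids)), key=lambda j: abs(x - centroids[j]))
            clusters[i].append(x)
        # Recompute each centroid as its cluster mean (keep old if empty)
        centroids = [statistics.mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

centroids = kmeans_1d(features)  # ~[0.2, 0.5, 0.8]: low / medium / high groups
```

In the study, an analogous unsupervised grouping of spectra was then related to independently measured nuclear-density ranges; the sketch only shows the clustering step on made-up features.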
This dataset accompanies the research article entitled, "Vibration of Natural Rock Arches and Towers Excited by Helicopter-Sourced Infrasound," where we investigate the vibration response of seven landforms to helicopter-sourced infrasound during controlled flight. Included are time-series vibration data of the landforms and nearby ground during and before helicopter flight, time-series infrasound data, 3D photogrammetry models of the studied landforms, and GPS data from the helicopter.
We analyzed 4,754 broadband seismic recordings of the SKS, SKKS, and SPdKS wavefield from 13 high-quality events sampling the Samoa ultralow-velocity zone (ULVZ). We measured differential travel times and amplitudes between the SKKS and SKS arrivals, which are highly sensitive to the emergence of the SPdKS seismic phase, which is in turn highly sensitive to lowermost-mantle velocity perturbations such as those generated by ULVZs. We modeled these data using a 2-D axisymmetric waveform modeling approach and are able to explain them with a single ULVZ. To predict both travel-time and amplitude perturbations, we found that a large ULVZ length in the great-circle-arc direction, on the order of 10° or larger, is required. The large ULVZ length limits the acceptable ULVZ elastic parameters. We find that δVS and δVP reductions of 20% to 22% and 15% to 17%, respectively, with a thickness of 26 km, give the best fit. Initial 3-D modeling efforts do not recover the extremes in the differential measurements, demonstrating that 3-D effects are important and must be considered in the future. However, the 3-D modeling is generally consistent with the velocity reductions recovered from the 2-D modeling. These velocity reductions are compatible with a compositional component to the ULVZ. Furthermore, geodynamic predictions for a moving compositional ULVZ yield a long linear shape similar to that of the Samoa ULVZ confirmed in this study.
This collection includes radial-component displacement seismograms in the time window containing the SKS, SKKS, and SPdKS seismic arrivals. These data all interact with the Samoa ultralow-velocity zone at the core-mantle boundary. All data used in the study of Krier et al., 2021 (JGR) are included in this collection.
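A minimal sketch of how a differential travel time between two arrivals can be measured by cross-correlation, using synthetic Gaussian pulses. The pulse shape, sample interval, and 2.5 s offset are invented for illustration and do not reflect the study's actual measurement procedure.

```python
import math

def cross_correlation_lag(a, b):
    """Lag (in samples) at which trace b best aligns with trace a."""
    n = len(a)
    best_lag, best_val = 0, -math.inf
    for lag in range(-n + 1, n):
        # Zero-padded cross-correlation at this lag
        s = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if s > best_val:
            best_val, best_lag = s, lag
    return best_lag

dt = 0.05                                 # sample interval in s (assumed)
t = [i * dt for i in range(400)]

def pulse(t0):                            # synthetic Gaussian arrival at t0
    return [math.exp(-((x - t0) / 0.3) ** 2) for x in t]

sks, skks = pulse(5.0), pulse(7.5)        # SKKS arrives 2.5 s after SKS here
lag = cross_correlation_lag(sks, skks)
diff_time = lag * dt                      # differential travel time, ~2.5 s
```

Real differential SKKS-SKS measurements involve windowing, filtering, and instrument corrections; this only shows the core alignment idea.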
Using a suite of numerical calculations, we consider the long-term evolution of circumbinary debris from the Pluto–Charon giant impact. Initially, these solids have large eccentricities and pericenters near Charon's orbit. On time scales of 100–1000 yr, dynamical interactions with Pluto and Charon lead to the ejection of most solids from the system. As the dynamics move particles away from the barycenter, collisional damping reduces the orbital eccentricity of many particles. These solids populate a circumbinary disk in the Pluto–Charon orbital plane; a large fraction of this material lies within a "satellite zone" that encompasses the orbits of Styx, Nix, Kerberos, and Hydra. Compared to the narrow rings generated from the debris of a collision between a trans-Neptunian object (TNO) and Charon, disks produced after the giant impact are much more extended and may be a less promising option for producing small circumbinary satellites.
We apply Bayesian inference to instrument calibration and experimental-data uncertainty analysis for the specific application of measuring radiative intensity with a narrow-angle radiometer. We develop a physics-based instrument model that describes temporally varying radiative intensity, the indirectly measured quantity of interest, as a function of scenario and model parameters. We identify a set of five uncertain parameters, find their probability distributions (the posterior or inverse problem) given the calibration data by applying Bayes’ Theorem, and employ a local linearization to marginalize the nuisance parameters resulting from errors-in-variables. We then apply the instrument model to a new scenario that is the intended use of the instrument, a 1.5 MW coal-fired furnace. Unlike standard error propagation, this Bayesian method infers values for the five uncertain parameters by sampling from the posterior distribution and then computing the intensity with quantifiable uncertainty at the point of a new, in-situ furnace measurement (the posterior predictive or forward problem). Given the instrument-model context of this analysis, the propagated uncertainty provides a significant proportion of the measurement error for each in-situ furnace measurement. With this approach, we produce uncertainties at each temporal measurement of the radiative intensity in the furnace, successfully identifying temporal variations that were otherwise indistinguishable from measurement uncertainty.
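A toy sketch of the inverse/forward pattern described above: sample a posterior over an uncertain instrument parameter given calibration data, then push the samples through the instrument model at a new measurement point. It uses a single made-up gain parameter and random-walk Metropolis sampling, not the paper's five-parameter model, linearized marginalization, or radiometer physics.

```python
import math
import random
import statistics

random.seed(1)

# Synthetic calibration data: signal = gain * intensity + Gaussian noise
true_gain, noise_sd = 2.0, 0.2
intensities = [0.5 + 0.1 * i for i in range(20)]
signals = [true_gain * x + random.gauss(0, noise_sd) for x in intensities]

def log_post(gain):
    """Log posterior: Gaussian likelihood times a broad Gaussian prior."""
    ll = sum(-0.5 * ((s - gain * x) / noise_sd) ** 2
             for x, s in zip(intensities, signals))
    lp = -0.5 * (gain / 10.0) ** 2        # weak prior, sd = 10
    return ll + lp

# Random-walk Metropolis sampling of the posterior (inverse problem)
gain, samples = 1.0, []
for step in range(5000):
    prop = gain + random.gauss(0, 0.05)
    if math.log(random.random()) < log_post(prop) - log_post(gain):
        gain = prop
    if step >= 1000:                      # discard burn-in
        samples.append(gain)

post_mean = statistics.mean(samples)      # ~2.0, with quantified spread
# Posterior predictive (forward problem): intensity at a new point x = 1.2,
# carrying the parameter uncertainty into the predicted measurement
pred = [g * 1.2 for g in samples]
```

The spread of `pred` is the propagated parametric uncertainty at the new measurement point, the analogue of the per-measurement uncertainty reported for the in-situ furnace data.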
Objective: In 2018, the Network of the National Libraries of Medicine (NNLM) launched a national sponsorship program to support U.S. public library staff in completing the Medical Library Association's (MLA) Consumer Health Information Specialization (CHIS). The primary objective of this research project was to determine whether completion of the sponsored specialization improved public library staff's ability to provide consumer health information and whether it resulted in new services, programming, or outreach activities at public libraries. Secondary objectives were to determine the motivation for and benefits of the specialization and to determine the impact of sponsorship on obtaining and continuing the specialization.
Methods: To evaluate the sponsorship program, we developed and administered a 16-question online survey via REDCap in August 2019 to 224 public library staff who were sponsored during the first year of the program. We measured confidence and competence in providing consumer health information using questions aligned with the eight Core Competencies for Providing Consumer Health Information Services [1]. Additionally, the survey included questions about new consumer health information activities at public libraries, public library staff motivation to obtain the specialization, and whether it led to immediate career gains. To determine the overall value of the NNLM sponsorship, we measured whether funding made it more likely for participants to complete or continue the specialization.
Results: Overall, 136 participants (61%) responded to the survey. Our findings indicated that the program was a success: over 80% of participants reported an increase in core consumer health competencies, with a statistically significant improvement in mean competency scores after completing the specialization. Ninety percent of participants have continued their engagement with NNLM, and over half offered new health information programs and services at their public library. All respondents indicated that completing the specialization met their expectations, but few reported immediate career gains. While over half of participants planned to renew the specialization or obtain the more advanced Level II specialization, 72% indicated they would not continue without the NNLM sponsorship.
Conclusion: Findings indicate that NNLM sponsorship of the CHIS specialization was successful in increasing the ability of public library staff to provide health information to their communities.
This dataset represents the de-identified raw results of a 16-question online survey (via REDCap) administered in August 2019 to 224 public library staff who were sponsored for a Consumer Health Information Specialization (CHIS). The purpose of the study was to determine whether the sponsorship program improved the ability of public library staff to provide consumer health information.
This study investigates the impacts of altering subgrid-scale mixing in "convection-permitting" km-scale horizontal grid spacing (∆h) simulations by applying either constant or stochastic multiplicative factors to the horizontal mixing coefficients within the Weather Research and Forecasting model. In quasi-idealized 1-km ∆h simulations of two observationally based squall line cases, constant enhanced mixing produces larger updraft cores that are more dilute at upper levels; weakens the cold pool, rear inflow jet, and front-to-rear flow of the squall line; and degrades the model's effective resolution. Reducing mixing by a constant multiplicative factor has the opposite effect on all metrics. Completely turning off parameterized horizontal mixing produces bulk updraft statistics and squall line mesoscale structure closest to an LES "benchmark" among all 1-km simulations, although the updraft cores are too undilute. The stochastic mixing scheme, which applies a multiplicative factor to the mixing coefficients that varies stochastically in time and space, is employed at 0.5-, 1-, and 2-km ∆h. It generally reduces mid-level vertical velocities and enhances upper-level vertical velocities compared to simulations using the standard mixing scheme, with more substantial impacts at 1-km and 2-km ∆h than at 0.5-km. The stochastic scheme also increases updraft dilution to better agree with the LES for one case but has less impact on the other case. Stochastic mixing weakens the cold pool without significantly affecting squall line propagation, and, unlike the constant multiplicative factors, it does not degrade the model's overall effective resolution.
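To make the contrast between constant and stochastic multiplicative factors concrete, here is an illustrative sketch (not WRF's actual scheme): a base mixing coefficient is either scaled by one fixed constant everywhere or multiplied by a fresh positive random draw per grid cell or time step. The base value, distribution, and clipping threshold are all assumptions.

```python
import random
import statistics

random.seed(2)

base_K = 50.0  # base horizontal mixing coefficient in m^2/s (assumed value)

def stochastic_factor(mean=1.0, sd=0.3):
    """Draw a positive multiplicative perturbation (illustrative form only)."""
    return max(0.1, random.gauss(mean, sd))

# Constant scaling applies one factor everywhere; stochastic scaling draws a
# new factor per grid cell / time step, perturbing mixing in space and time.
constant_K   = [1.5 * base_K for _ in range(1000)]
stochastic_K = [stochastic_factor() * base_K for _ in range(1000)]
```

The constant case shifts the mean mixing everywhere, while the stochastic case keeps the mean near the base value and adds spatial/temporal variability, loosely mirroring the contrast the study draws between the two approaches.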
We consider a scenario where the small satellites of Pluto and Charon grew within a disk of debris from an impact between Charon and a trans-Neptunian object (TNO). After Charon's orbital motion boosts the debris into a disk-like structure, rapid orbital damping of meter-sized or smaller objects is essential to prevent the subsequent reaccretion or dynamical ejection by the binary. From analytical estimates and simulations of disk evolution, we estimate an impactor radius of 30-100 km; smaller (larger) radii apply to an oblique (direct) impact. Although collisions between large TNOs and Charon are unlikely today, they were relatively common within the first 0.1-1 Gyr of the solar system. Compared to models where the small satellites agglomerate in the debris left over by the giant impact that produced the Pluto-Charon binary planet, satellite formation from a later impact on Charon avoids the destabilizing resonances that sweep past the satellites during the early orbital expansion of the binary.
Ground-based measurements of frozen precipitation are heavily influenced by interactions of surface winds with gauge-shield geometry. The Multi-Angle Snowflake Camera (MASC), which photographs hydrometeors in free-fall from three different angles while simultaneously measuring their fall speed, has been used in the field at multiple mid-latitude and polar locations both with and without wind shielding. Here we present an analysis of Arctic field observations — with and without a Belfort double Alter shield — and compare the results to computational fluid dynamics (CFD) simulations of the airflow and corresponding particle trajectories around the unshielded MASC. MASC-measured fall speeds compare well with Ka-band Atmospheric Radiation Measurement (ARM) Zenith Radar (KAZR) mean Doppler velocities only when winds are light (< 5 m/s) and the MASC is shielded. MASC-measured fall speeds that do not match KAZR measured velocities tend to fall below a threshold value that increases approximately linearly with wind speed but is generally < 0.5 m/s. For those events with wind speeds < 1.5 m/s, hydrometeors fall with an orientation angle mode of 12 degrees from the horizontal plane, and large, low-density aggregates are as much as five times more likely to be observed. Simulations in the absence of a wind shield show a separation of flow at the upstream side of the instrument, with an upward velocity component just above the aperture, which decreases the mean particle fall speed by 55% (74%) for a wind speed of 5 m/s (10 m/s). We conclude that accurate MASC observations of the microphysical, orientation, and fall speed characteristics of snow particles require shielding by a double wind fence and restriction of analysis to events where winds are light (< 5 m/s). Hydrometeors do not generally fall in still air, so adjustments to these properties' distributions within natural turbulence remain to be determined.