The objective of this study was to determine the influence of face shields on the concentration of respirable aerosols in the breathing zone of the wearer. The experimental approach involved generating a polydisperse respirable test dust aerosol in a low-speed wind tunnel over 15 minutes, with a breathing mannequin positioned downstream. Aerosol concentrations were measured in the breathing zone of the mannequin and at an upstream location using two laser spectrophotometers that measured particle number concentration over the range 0.25-31 µm. Three face shield designs (A, B, and C) were tested on the mannequin, which was operated at high and low breathing rates. Efficiency – the reduction in aerosol concentration in the breathing zone – was calculated as a function of particle size and overall for each face shield. Face shield A, a bucket hat with a flexible shield, had the highest efficiency, approximately 95%, while the more traditional face shield designs had efficiencies of 53-78%, depending on face shield and breathing rate. Efficiency varied by particle size, but the pattern differed among face shield designs. Face shields decreased the concentration of respirable aerosols in the breathing zone when aerosols were carried perpendicular to the face. Additional research is needed to understand the impact of face shield position relative to the source.
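The efficiency metric described above (the fractional reduction in breathing-zone concentration relative to the upstream reference) can be sketched as follows. The size bins and concentration values below are hypothetical placeholders for illustration, not the study's data.

```python
import numpy as np

# Hypothetical size-resolved number concentrations (particles/cm^3) from the
# two spectrometers; bin edges span the instruments' 0.25-31 um range.
size_bin_edges_um = np.array([0.25, 0.5, 1.0, 2.5, 5.0, 10.0, 31.0])
c_upstream = np.array([120.0, 80.0, 45.0, 20.0, 8.0, 2.0])   # reference location
c_breathing = np.array([6.0, 5.0, 4.0, 3.0, 2.0, 1.0])       # behind face shield

# Size-resolved efficiency: fractional reduction relative to upstream.
eff_by_size = 1.0 - c_breathing / c_upstream

# Overall efficiency, implicitly weighted by the upstream size distribution.
eff_overall = 1.0 - c_breathing.sum() / c_upstream.sum()
```

With these placeholder numbers the overall efficiency evaluates to about 0.92, i.e. a 92% reduction in breathing-zone concentration.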
The Differential Emissivity Imaging Disdrometer (DEID) is a new evaporation-based optical and thermal instrument designed to measure the mass, size, density, and type of individual hydrometeors and their bulk properties. Hydrometeor spatial dimensions are measured on a heated metal plate using an infrared camera by exploiting the much higher thermal emissivity of water compared with metal. As a melted hydrometeor evaporates, its mass can be directly related to the loss of heat from the hotplate, assuming energy conservation across the hydrometeor. The heat loss required to evaporate a hydrometeor is found to be independent of environmental conditions, including ambient wind velocity, moisture level, and temperature. The difference in heat loss between snow and rain of a given mass offers a method for discriminating precipitation phase. The DEID measures hydrometeors at sampling frequencies up to 1 Hz, with masses greater than 1 µg and effective diameters greater than 200 µm, limits determined by the size of the hotplate and the thermal camera specifications. Measurable snow water equivalent (SWE) precipitation rates range from 0.001 to 200 mm h−1, as validated against a standard weighing bucket. Preliminary field-experiment measurements of snow and rain from the winters of 2019 and 2020 provided continuous automated measurements of precipitation rate, snow density, and visibility. Measured hydrometeor size distributions agree well with canonical results described in the literature.

A new precipitation sensor, the Differential Emissivity Imaging Disdrometer (DEID), is used to provide the first continuous measurements of the mass, diameter, and density of individual hydrometeors. The DEID consists of an infrared camera pointed at a heated aluminum plate.
It exploits the contrasting thermal emissivity of water and metal to determine individual particle mass by assuming that energy is conserved during the transfer of heat from the plate to the particle during evaporation. Particle density is determined from a combination of particle mass and morphology. A Multi-Angle Snowflake Camera (MASC) was deployed alongside the DEID to provide refined imagery of particle size and shape. Broad consistency is found between derived mass-diameter and density-diameter relationships and those obtained in prior studies. However, DEID measurements show a generally weaker dependence on size for hydrometeor density and a stronger dependence for aggregate snowflake mass.
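The energy-conservation idea behind the DEID mass retrieval can be illustrated with a back-of-the-envelope inversion. The function below is a simplified sketch, not the instrument's calibrated algorithm: it assumes all measured heat loss goes into warming, melting (for frozen particles), and evaporating the particle, using textbook constants and an assumed plate temperature, with no temperature dependence of the latent heats.

```python
# Textbook constants (SI); a real retrieval would use temperature-dependent values.
L_F = 3.34e5    # J/kg, latent heat of fusion (melting frozen hydrometeors)
L_V = 2.26e6    # J/kg, latent heat of vaporization
C_W = 4186.0    # J/(kg K), specific heat of liquid water

def hydrometeor_mass(heat_loss_j, plate_temp_c, frozen=True):
    """Invert the simplified energy budget Q = m * (L_f + c_w * dT + L_v)
    for particle mass m (kg), given plate heat loss Q (J)."""
    energy_per_kg = C_W * plate_temp_c + L_V   # warm the melt from 0 C, evaporate
    if frozen:
        energy_per_kg += L_F                   # extra heat to melt ice first
    return heat_loss_j / energy_per_kg

# Phase discrimination: for equal mass, a frozen particle removes L_F more
# heat per kilogram than a liquid one (plate temperature of 85 C is assumed).
m = 2e-9                                       # a 2 ug particle
q_snow = m * (L_F + C_W * 85.0 + L_V)
q_rain = m * (C_W * 85.0 + L_V)
```

The difference q_snow - q_rain equals m * L_F, which is the heat-loss contrast between snow and rain that the abstract describes as a basis for discriminating precipitation phase.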
This study of the role and impact of the subject selector in academic libraries is unique and long overdue. We focused on the Pac-12 university libraries, a representative sample of nationwide academic libraries. The strength of our investigation lies in this small, focused sample and its statistical analysis of subject specialists. These libraries vary widely in their hiring requirements: some require the MLIS, some the MLIS plus an additional advanced subject master's degree, and others hire non-MLIS librarians. This investigation may promote greater awareness of the future of subject specialists in academic libraries.
This dataset accompanies the research article entitled "Etiology-Specific Remodeling in Ventricular Tissue of Heart Failure Patients and its Implications for Computational Modeling of Electrical Conduction," in which we quantified fibrosis and performed electrophysiological simulations to investigate electrical propagation in etiologically varied heart failure tissue samples. Included are raw confocal microscopy images, data for extracting and processing the raw images, and scripts to analyze fibrosis and generate meshes for simulation.
This dataset comprises MODTRAN radiative transfer simulations used to determine scene-specific enhancement spectra for matched filter retrieval of CH4 and CO2 concentrations from imaging spectroscopy data. An example implementation to generate an enhancement spectrum is also included.
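As a sketch of how such an enhancement spectrum is typically used, the classical matched filter scores each pixel by projecting its background-subtracted radiance onto the covariance-whitened target spectrum. This is a generic illustration with assumed toy inputs, not the implementation included in the dataset.

```python
import numpy as np

def matched_filter(x, mu, Sigma, target):
    """Classical matched filter score: alpha = (x - mu)^T S^-1 t / (t^T S^-1 t).
    x: pixel radiance (n_bands,); mu, Sigma: background mean and covariance;
    target: enhancement spectrum, e.g. simulated with MODTRAN for added CH4."""
    s_inv_t = np.linalg.solve(Sigma, target)
    return (x - mu) @ s_inv_t / (target @ s_inv_t)

# Toy example: an assumed 5-band target spectrum and white background noise.
t = np.array([1.0, 0.5, 0.2, 0.1, 0.05])
mu = np.zeros(5)
Sigma = 0.01 * np.eye(5)

# A pixel carrying the target at strength 3 is scored at alpha = 3.
alpha = matched_filter(mu + 3.0 * t, mu, Sigma, t)
```

In practice the score alpha is interpreted as the per-pixel concentration enhancement, and the background statistics are estimated from the scene itself.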
Environmental noise may affect hearing and a variety of non-auditory disease processes. There is some evidence that, like other environmental hazards, noise may be differentially distributed across communities based on socioeconomic status. We aimed to (a) predict daytime noise pollution levels and (b) assess disparities in daytime noise exposure in Chicago, Illinois. We measured 5-minute daytime noise levels (Leq,5-min) at 75 randomly selected sites in Chicago in March 2019. Geographically based variables thought to be associated with noise were obtained and used to fit a noise land-use regression model estimating the daytime environmental noise level at the centroid of each census block. Demographic and socioeconomic data were obtained from the City of Chicago for the 77 community areas, and associations with daytime noise levels were assessed using spatial autoregressive models. The mean sampled noise level (Leq,5-min) was 60.6 dBA. The adjusted R2 and root mean square error were 0.60 and 4.67 dBA for the noise land-use regression model and 0.51 and 5.90 dBA for the validation model, respectively. Nearly 75% of city blocks and 85% of city communities had predicted daytime noise levels higher than 55 dBA. Of the socioeconomic variables explored, only community per capita income was associated with mean community predicted noise levels, which were highest for communities with incomes in the 2nd quartile. Both the noise measurements and the land-use regression modeling demonstrate that Chicago has levels of environmental noise likely contributing to the total burden of environmental stressors. Noise is not uniformly distributed across Chicago; it is associated with proximity to roads and public transportation and is higher among communities with mid-to-low per capita incomes, highlighting how socially and economically disadvantaged communities may be disproportionately impacted by this environmental exposure.
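The land-use regression step can be sketched as an ordinary least-squares fit of measured noise levels on geographic predictors. The predictor names and synthetic data below are illustrative assumptions; the study's actual covariates and model form are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for geographically based predictors at 75 sites,
# e.g. log distance to a major road, nearby transit stops, population density.
n = 75
log_road_dist = rng.uniform(2.0, 7.0, n)
transit_stops = rng.integers(0, 10, n).astype(float)
pop_density = rng.uniform(1.0, 30.0, n)

# Synthetic Leq,5-min (dBA) generated from an assumed linear relationship.
y = 75.0 - 2.0 * log_road_dist + 0.8 * transit_stops \
    + 0.1 * pop_density + rng.normal(0.0, 2.0, n)

# Fit the land-use regression by ordinary least squares.
X = np.column_stack([np.ones(n), log_road_dist, transit_stops, pop_density])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Goodness-of-fit metrics of the kind reported in the abstract: R^2 and RMSE.
pred = X @ beta
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
rmse = np.sqrt(ss_res / n)
```

The fitted model can then be evaluated at block centroids to map predicted noise citywide, which is how the abstract's 75 point measurements extend to blocks and community areas.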
Significance: Current medical imaging systems have many limitations for applications in cardiovascular disease. New technologies may overcome these limitations. Of particular interest are technologies for diagnosing cardiac diseases, e.g., fibrosis, myocarditis, and transplant rejection.
Aim: To introduce and assess a new optical system capable of assessing cardiac muscle tissue using light-scattering spectroscopy (LSS) in conjunction with machine learning.
Approach: We applied an ovine model to investigate whether the new LSS system is capable of estimating densities of cell nuclei in cardiac tissue. We measured nuclear density using fluorescent labeling, confocal microscopy, and image processing. Spectra acquired from the same cardiac tissues were analyzed with spectral clustering and convolutional neural networks to assess the feasibility and reliability of density quantification.
Results: Spectral clustering revealed distinct groups of spectra correlated to ranges of nuclear density. Convolutional neural networks correctly classified 3 groups of spectra with low, medium, or high nuclear density with 95.00±11.77% (mean and standard deviation) accuracy. The analysis revealed sensitivity of the accuracy to wavelength range and subsampling of spectra.
Conclusions: LSS combined with machine learning is capable of assessing nuclear density in cardiac tissues. The approach could be useful for diagnosing cardiac diseases associated with an increase in the number of nuclei.
This dataset accompanies the research article entitled, "Vibration of Natural Rock Arches and Towers Excited by Helicopter-Sourced Infrasound," where we investigate the vibration response of seven landforms to helicopter-sourced infrasound during controlled flight. Included are time-series vibration data of the landforms and nearby ground during and before helicopter flight, time-series infrasound data, 3D photogrammetry models of the studied landforms, and GPS data from the helicopter.
We analyzed 4,754 broadband seismic recordings of the SKS, SKKS, and SPdKS wavefield from 13 high-quality events sampling the Samoa ultralow-velocity zone (ULVZ). We measured differential travel times and amplitudes between the SKKS and SKS arrivals, which are highly sensitive to the emergence of the SPdKS seismic phase, which is in turn highly sensitive to lowermost-mantle velocity perturbations such as those generated by ULVZs. We modeled these data using a 2-D axisymmetric waveform modeling approach and are able to explain them with a single ULVZ. To predict both travel-time and amplitude perturbations, we found that a large ULVZ length in the great-circle-arc direction, on the order of 10° or more, is required. The large ULVZ length limits acceptable ULVZ elastic parameters. We find that δVS and δVP reductions of 20% to 22% and 15% to 17%, respectively, give the best fit, with a thickness of 26 km. Initial 3-D modeling efforts do not recover the extremes in the differential measurements, demonstrating that 3-D effects are important and must be considered in the future. However, the 3-D modeling is generally consistent with the velocity reductions recovered with the 2-D modeling. These velocity reductions are compatible with a compositional component to the ULVZ. Furthermore, geodynamic models of a moving compositional ULVZ predict a long linear shape similar to that of the Samoa ULVZ confirmed in this study.
This collection includes radial-component displacement seismograms in the time window containing the SKS, SKKS, and SPdKS seismic arrivals. These data all interact with the Samoa ultralow-velocity zone at the core-mantle boundary. All data used in the study of Krier et al., 2021 (JGR) are included in this collection.
Using a suite of numerical calculations, we consider the long-term evolution of circumbinary debris from the Pluto-Charon giant impact. Initially, these solids have large eccentricities and pericenters near Charon's orbit. On time scales of 100-1000 yr, dynamical interactions with Pluto and Charon lead to the ejection of most solids from the system. As the dynamics moves particles away from the barycenter, collisional damping reduces the orbital eccentricity of many particles. These solids populate a circumbinary disk in the Pluto-Charon orbital plane; a large fraction of this material lies within a 'satellite zone' that encompasses the orbits of Styx, Nix, Kerberos, and Hydra. Compared to the narrow rings generated from the debris of a collision between a trans-Neptunian object (TNO) and Charon, disks produced after the giant impact are much more extended and may be a less promising option for producing small circumbinary satellites.
We apply Bayesian inference to instrument calibration and experimental-data uncertainty analysis for the specific application of measuring radiative intensity with a narrow-angle radiometer. We develop a physics-based instrument model that describes temporally varying radiative intensity, the indirectly measured quantity of interest, as a function of scenario and model parameters. We identify a set of five uncertain parameters, find their probability distributions (the posterior or inverse problem) given the calibration data by applying Bayes’ Theorem, and employ a local linearization to marginalize the nuisance parameters resulting from errors-in-variables. We then apply the instrument model to a new scenario that is the intended use of the instrument, a 1.5 MW coal-fired furnace. Unlike standard error propagation, this Bayesian method infers values for the five uncertain parameters by sampling from the posterior distribution and then computing the intensity with quantifiable uncertainty at the point of a new, in-situ furnace measurement (the posterior predictive or forward problem). Given the instrument-model context of this analysis, the propagated uncertainty provides a significant proportion of the measurement error for each in-situ furnace measurement. With this approach, we produce uncertainties at each temporal measurement of the radiative intensity in the furnace, successfully identifying temporal variations that were otherwise indistinguishable from measurement uncertainty.
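The calibrate-then-predict workflow can be miniaturized as follows: a toy linear instrument model whose two parameters are sampled from the posterior by random-walk Metropolis, after which the posterior predictive distribution of intensity is formed for a new measurement. The model, flat priors, and all numbers below are illustrative assumptions, far simpler than the five-parameter physics-based model of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instrument model: signal = gain * intensity + offset, Gaussian noise.
true_gain, true_offset, noise_sd = 2.0, 0.5, 0.05
I_cal = np.linspace(0.0, 1.0, 20)        # known calibration intensities
s_cal = true_gain * I_cal + true_offset + rng.normal(0.0, noise_sd, I_cal.size)

def log_post(theta):
    """Log posterior with flat priors: Gaussian likelihood of the residuals."""
    gain, offset = theta
    resid = s_cal - (gain * I_cal + offset)
    return -0.5 * np.sum(resid**2) / noise_sd**2

# Random-walk Metropolis sampling of the two-parameter posterior.
samples, theta = [], np.array([1.0, 0.0])
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.02, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples)[1000:]             # drop burn-in

# Posterior predictive: infer intensity from a new in-situ signal s_new,
# propagating parameter uncertainty into the inferred intensity.
s_new = 1.7
I_post = (s_new - samples[:, 1]) / samples[:, 0]
```

The spread of I_post is the quantified measurement uncertainty at the new operating point, analogous to the per-measurement intensity uncertainties the study reports for the furnace.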
Objective: In 2018, the Network of the National Libraries of Medicine (NNLM) launched a national sponsorship program to support U.S. public library staff in completing the Medical Library Association’s (MLA) Consumer Health Information Specialization (CHIS). The primary objective of this research project was to determine whether completion of the sponsored specialization was successful in improving public library staff ability to provide consumer health information and whether it resulted in new services, programming, or outreach activities at public libraries. Secondary objectives were to determine the motivation for and benefits of the specialization and to determine the impact of sponsorship on obtaining and continuing the specialization.
Methods: To evaluate the sponsorship program, we developed and administered a 16-question online survey via REDCap in August 2019 to 224 public library staff who were sponsored during the first year of the program. We measured confidence and competence in providing consumer health information using questions aligned with the eight Core Competencies for Providing Consumer Health Information Services [1]. Additionally, the survey included questions about new consumer health information activities at public libraries, public library staff motivation to obtain the specialization, and whether it led to immediate career gains. To determine the overall value of the NNLM sponsorship, we measured whether funding made it more likely for participants to complete or continue the specialization.
Results: Overall, 136 participants (61%) responded to the survey. Our findings indicated that the program was a success: over 80% of participants reported an increase in core consumer health competencies, with a statistically significant improvement in mean competency scores after completing the specialization. Ninety percent of participants have continued their engagement with NNLM, and over half offered new health information programs and services at their public library. All respondents indicated that completing the specialization met their expectations, but few reported immediate career gains. While over half of participants planned to renew the specialization or obtain the more advanced Level II specialization, 72% indicated they would not continue without the NNLM sponsorship.
Conclusion: Findings indicate that NNLM sponsorship of the CHIS specialization was successful in increasing the ability of public library staff to provide health information to their communities.

This dataset represents the de-identified raw results of a 16-question online survey (via REDCap) administered in August 2019 to 224 public library staff who were sponsored for a Consumer Health Information Specialization (CHIS). The purpose of the study was to determine whether the sponsorship program had an impact on the ability of public library staff to provide consumer health information.
This study investigates the impacts of altering subgrid-scale mixing in “convection-permitting” km-scale horizontal grid spacing (∆h) simulations by applying either constant or stochastic multiplicative factors to the horizontal mixing coefficients within the Weather Research and Forecasting model. In quasi-idealized 1-km ∆h simulations of two observationally based squall line cases, constant enhanced mixing produces larger updraft cores that are more dilute at upper levels, weakens the cold pool, rear inflow jet, and front-to-rear flow of the squall line, and degrades the model’s effective resolution. Reducing mixing by a constant multiplicative factor has the opposite effect on all metrics. Completely turning off parameterized horizontal mixing produces bulk updraft statistics and squall line mesoscale structure closest to an LES “benchmark” among all 1-km simulations, although the updraft cores are too undilute. The stochastic mixing scheme, which applies a multiplicative factor to the mixing coefficients that varies stochastically in time and space, is employed at 0.5-, 1-, and 2-km ∆h. It generally reduces mid-level vertical velocities and enhances upper-level vertical velocities compared to simulations using the standard mixing scheme, with more substantial impacts at 1-km and 2-km ∆h than at 0.5-km. The stochastic scheme also increases updraft dilution to better agree with the LES for one case, but has less impact on the other case. Stochastic mixing acts to weaken the cold pool but without a significant impact on squall line propagation. Unlike the application of constant multiplicative factors to the mixing coefficients, it also does not affect the model’s overall effective resolution.
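One way to picture the stochastic scheme is as a multiplicative factor on the horizontal mixing coefficients that decorrelates over an assumed time scale at each grid point. The sketch below is a generic AR(1) lognormal perturbation field, an illustration of the idea rather than the scheme implemented in WRF; the amplitude, decorrelation time, and baseline coefficient are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative grid and stochastic-process parameters (all assumed).
nx, ny, nt = 32, 32, 100
tau = 10.0                       # decorrelation time, in time steps
phi = np.exp(-1.0 / tau)         # AR(1) autocorrelation per step
sigma = 0.3                      # perturbation amplitude

# Baseline horizontal mixing coefficient field, m^2/s (assumed uniform here).
K = np.full((nx, ny), 100.0)

# Evolve a Gaussian AR(1) field eps and map it to a multiplicative factor
# gamma = exp(eps - sigma^2/2), which is lognormal with mean ~1 so the
# perturbation neither systematically enhances nor reduces mixing.
eps = rng.normal(0.0, sigma, (nx, ny))
for t in range(nt):
    eps = phi * eps + np.sqrt(1.0 - phi**2) * rng.normal(0.0, sigma, (nx, ny))
    gamma = np.exp(eps - 0.5 * sigma**2)
    K_pert = gamma * K           # stochastically perturbed coefficient field
```

Setting sigma to zero recovers the standard (unperturbed) mixing, and a constant gamma instead of the AR(1) field recovers the constant-multiplicative-factor experiments described above.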
We consider a scenario where the small satellites of Pluto and Charon grew within a disk of debris from an impact between Charon and a trans-Neptunian object (TNO). After Charon's orbital motion boosts the debris into a disk-like structure, rapid orbital damping of meter-sized or smaller objects is essential to prevent the subsequent reaccretion or dynamical ejection by the binary. From analytical estimates and simulations of disk evolution, we estimate an impactor radius of 30-100 km; smaller (larger) radii apply to an oblique (direct) impact. Although collisions between large TNOs and Charon are unlikely today, they were relatively common within the first 0.1-1 Gyr of the solar system. Compared to models where the small satellites agglomerate in the debris left over by the giant impact that produced the Pluto-Charon binary planet, satellite formation from a later impact on Charon avoids the destabilizing resonances that sweep past the satellites during the early orbital expansion of the binary.
Ground-based measurements of frozen precipitation are heavily influenced by interactions of surface winds with gauge-shield geometry. The Multi-Angle Snowflake Camera (MASC), which photographs hydrometeors in free-fall from three different angles while simultaneously measuring their fall speed, has been used in the field at multiple mid-latitude and polar locations both with and without wind shielding. Here we present an analysis of Arctic field observations — with and without a Belfort double Alter shield — and compare the results to computational fluid dynamics (CFD) simulations of the airflow and corresponding particle trajectories around the unshielded MASC. MASC-measured fall speeds compare well with Ka-band Atmospheric Radiation Measurement (ARM) Zenith Radar (KAZR) mean Doppler velocities only when winds are light (< 5 m/s) and the MASC is shielded. MASC-measured fall speeds that do not match KAZR-measured velocities tend to fall below a threshold value that increases approximately linearly with wind speed but is generally < 0.5 m/s. For those events with wind speeds < 1.5 m/s, hydrometeors fall with an orientation angle mode of 12 degrees from the horizontal plane, and large, low-density aggregates are as much as five times more likely to be observed. Simulations in the absence of a wind shield show a separation of flow at the upstream side of the instrument, with an upward velocity component just above the aperture, which decreases the mean particle fall speed by 55% (74%) for a wind speed of 5 m/s (10 m/s). We conclude that accurate MASC observations of the microphysical, orientation, and fall speed characteristics of snow particles require shielding by a double wind fence and restriction of analysis to events where winds are light (< 5 m/s). Hydrometeors do not generally fall in still air, so adjustments to these properties' distributions within natural turbulence remain to be determined.
Thin boundary layer Arctic mixed-phase clouds are generally thought to precipitate pristine and aggregate ice crystals. Here we present automated surface photographic measurements showing that only 35% of precipitation particles exhibit negligible riming and that graupel particles ≥ 1 mm in diameter commonly fall from clouds with liquid water paths less than 50 g m−2. A simple analytical formulation predicts that significant riming enhancement can occur in updrafts with speeds typical of Arctic clouds, and observations show that such conditions are favored by weak temperature inversions and strong radiative cooling at cloud top. However, numerical simulations suggest that a mean updraft speed of 0.75 m s−1 would need to be sustained for over one hour. Graupel can efficiently remove moisture and aerosols from the boundary layer. The causes and impacts of Arctic riming enhancement remain to be determined.
The Andes Cordillera, which runs the length of South America and rises up to 5,000 m MSL within 200 km of the Pacific coast, dramatically influences the distribution of winter precipitation and snowpack over Chile and Argentina. The study of orographic precipitation processes, particularly along the western slopes of the Andes, is important to improve forecasts of severe flooding and snowpack in a region that depends on snowmelt for water resources. While orographic effects have been investigated on synoptic scales in the Andes, the lack of operational radar coverage and high-elevation, long-term precipitation records have, before the present study, precluded an in-depth investigation into the mesoscale and microphysical processes that affect the distribution of precipitation in the region.
This dataset was collected during the Chilean Orographic and Mesoscale Precipitation Study (ChOMPS), which, from May to October 2016, investigated the evolution of precipitation amounts, drop size distribution, and the vertical profile of radar echoes along an east-west transect that stretched from the Pacific coast to the windward slope of the Andes. The transect, at ~36°S, was made up of a coastal site upstream of the coastal mountain range (Concepción), a central valley site (Chillán), and a mountain site (Las Trancas). Instrumentation along the transect included three vertically pointing Micro Rain Radars, two Parsivel disdrometers, and several meteorological stations.
The dataset documents the evolution of Doppler velocity and reflectivity profiles with inland extent during early, middle, and late storm sectors. Additionally, the transect provides a season-long record of the inland evolution of melting layer height as well as the prevalence and structure of shallow non-brightband rain and the characteristics of its inland penetration to the central valley. This dataset, the first of its kind in the Chilean Andes, provides unique insight into mesoscale and orographic precipitation processes that also have applicability to the west coast of the United States and other mountainous regions.