med-mastodon.com is one of the many independent Mastodon servers you can use to participate in the fediverse.
Medical community on Mastodon


#xcms

1 post · 1 participant · 0 posts today

We just added a #reproducible example for large-scale #metabolomics data preprocessing using the @bioconductor #xcms package to @phili 's #Metabonaut tutorials resource! 💪


It also runs on a standard 💻, thanks to the new memory-saving processing in #xcms,

using the largest (n = 4000) data set from MetaboLights.

Large Scale Data Preprocessing with xcms (rformassspectrometry.github.io)
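The memory-saving idea can be illustrated generically: process samples in small batches and keep only reduced summaries in memory, so the full raw data for thousands of samples never accumulates at once. The sketch below is illustrative Python, not the actual xcms (R) implementation; the `summarize` reduction is a hypothetical stand-in for real preprocessing steps.

```python
import itertools
import numpy as np

def summarize(sample):
    # Reduce one sample's raw signal to a tiny summary (mean intensity here),
    # so the full data never needs to sit in memory at once.
    return float(np.mean(sample))

def process_in_batches(sample_iter, batch_size=100):
    """Consume samples lazily in batches, retaining only the summaries."""
    summaries = []
    while True:
        batch = list(itertools.islice(sample_iter, batch_size))
        if not batch:
            break
        summaries.extend(summarize(s) for s in batch)
    return summaries

# 4000 simulated "samples" produced lazily by a generator:
# peak memory is bounded by one batch, not the whole data set.
rng = np.random.default_rng(0)
samples = (rng.random(50) for _ in range(4000))
summaries = process_in_batches(samples)
print(len(summaries))  # 4000
```

Because the input is a generator, only `batch_size` samples are materialised at any moment, which is the same principle that lets a large study run on a standard laptop.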

Yes, we're still working on improving the #xcms @bioconductor :rstats: package for LC-MS #metabolomics data preprocessing!

This time: filtering features based on quality criteria from @davidbroadhurst et al. (doi.org/10.1007/s11306-018-136).

A great contribution from @phili !

More info: bit.ly/3UqmfSx

Next on the list: mzTab-M export.

SpringerLink: Guidelines and considerations for the use of system suitability and quality control samples in mass spectrometry assays applied in untargeted clinical metabolomic studies (Metabolomics)

Background: Quality assurance (QA) and quality control (QC) are two quality management processes that are integral to the success of metabolomics, including their application for the acquisition of high-quality data in any high-throughput analytical chemistry laboratory. QA defines all the planned and systematic activities implemented before samples are collected, to provide confidence that a subsequent analytical process will fulfil predetermined requirements for quality. QC can be defined as the operational techniques and activities used to measure and report these quality requirements after data acquisition.

Aim of review: This tutorial review will guide the reader through the use of system suitability and QC samples, why these samples should be applied and how the quality of data can be reported.

Key scientific concepts of review: System suitability samples are applied to assess the operation and lack of contamination of the analytical platform prior to sample analysis. Isotopically-labelled internal standards are applied to assess system stability for each sample analysed. Pooled QC samples are applied to condition the analytical platform, perform intra-study reproducibility measurements (QC) and to correct mathematically for systematic errors. Standard reference materials and long-term reference QC samples are applied for inter-study and inter-laboratory assessment of data.
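One widely used criterion from these guidelines is to keep only features whose relative standard deviation (RSD) across repeated pooled-QC injections stays below a threshold (30% is a commonly cited cut-off). A minimal sketch of that filter, in illustrative Python rather than the xcms API, with a made-up toy feature matrix:

```python
import numpy as np

def rsd(values):
    """Relative standard deviation in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

def filter_features_by_qc_rsd(feature_matrix, qc_columns, max_rsd=30.0):
    """Keep features (rows) whose RSD across pooled-QC injections
    is below the threshold; return kept rows and all RSD values."""
    qc = feature_matrix[:, qc_columns]
    rsds = np.array([rsd(row) for row in qc])
    return feature_matrix[rsds < max_rsd], rsds

# Toy matrix: 3 features x 5 pooled-QC injections
m = np.array([
    [100, 102,  98, 101,  99],   # stable feature, low RSD  -> kept
    [100, 200,  50, 300,  10],   # unstable feature, high RSD -> dropped
    [ 10,  11,   9,  10,  10],   # stable feature            -> kept
], dtype=float)

kept, rsds = filter_features_by_qc_rsd(m, qc_columns=[0, 1, 2, 3, 4])
print(kept.shape[0])  # 2
```

The intuition: a pooled QC is chemically identical in every injection, so any variation in a feature's measured intensity across QCs is analytical noise, and features that cannot be measured reproducibly are removed before statistics.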

I spent SO MANY HOURS in grad school manually adjusting peak baselines and deciding what should or shouldn't count as a peak and then going back and doing it all again because I called it a peak in one sample but not the other. I'm SO happy that progress is being made on this problem and very excited to read this preprint!

biorxiv.org/content/10.1101/20

#GCMS #LCMS #XCMS #AnalyticalChemistry #Chromatography
#Metabolomics

bioRxiv: Picky with peakpicking: assessing chromatographic peak quality with simple metrics in metabolomics

Chromatographic peakpicking continues to represent a significant bottleneck in automated LC-MS workflows. Uncontrolled false discovery rates and the lack of manually-calibrated quality metrics require researchers to visually evaluate individual peaks, requiring large amounts of time and breaking replicability. This problem is exacerbated in noisy environmental datasets and for novel separation methods such as hydrophilic interaction columns in metabolomics, creating a demand for a simple, intuitive, and robust metric of peak quality.

Here, we manually labeled four HILIC oceanographic particulate metabolite datasets to assess the performance of individual peak quality metrics. We used these datasets to construct a predictive model calibrated to the likelihood that visual inspection by an MS expert would include a given mass feature in the downstream analysis. We implemented two novel peak quality metrics, a custom signal-to-noise metric and a test of similarity to a bell curve, both calculated from the raw data in the extracted ion chromatogram, and found that these outperformed existing measurements of peak quality. A simple logistic regression model built on two metrics reduced the fraction of false positives in the analysis from 70-80% down to 1-5% and showed minimal overfitting when applied to novel datasets.

We then explored the implications of this quality thresholding on the conclusions obtained by the downstream analysis and found that while only 10% of the variance in the dataset could be explained by depth in the default output from the peakpicker, approximately 40% of the variance was explained when restricted to high-quality peaks alone. We conclude that the poor performance of peakpicking algorithms significantly reduces the power of both univariate and multivariate statistical analyses to detect environmental differences. We demonstrate that simple models built on intuitive metrics and derived from the raw data are more robust and can outperform more complex models when applied to new data. Finally, we show that in properly curated datasets, depth is a major driver of variability in the marine microbial metabolome and identify several interesting metabolite trends for future investigation.

Competing Interest Statement: The authors have declared no competing interest.
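The two metrics the abstract describes, a signal-to-noise score and a similarity-to-bell-curve test computed from the raw extracted ion chromatogram, combined through a logistic model, can be sketched roughly as follows. This is illustrative Python with invented metric definitions and weights, not the preprint's actual implementation.

```python
import numpy as np

def snr_metric(intensity):
    """Crude signal-to-noise: apex height over a robust noise estimate
    (median absolute deviation of the trace)."""
    noise = np.median(np.abs(intensity - np.median(intensity))) + 1e-9
    return float(np.max(intensity) / noise)

def gaussian_similarity(intensity):
    """Pearson correlation between the EIC and a bell curve centred on
    the apex (a rough stand-in for the paper's shape test)."""
    x = np.arange(len(intensity))
    apex = float(np.argmax(intensity))
    width = max(len(intensity) / 6.0, 1.0)
    bell = np.exp(-0.5 * ((x - apex) / width) ** 2)
    return float(np.corrcoef(intensity, bell)[0, 1])

def peak_quality(intensity, w_snr=0.2, w_shape=4.0, bias=-6.0):
    """Combine the two metrics with illustrative logistic weights
    (real weights would be fit to expert-labeled peaks)."""
    z = bias + w_snr * snr_metric(intensity) + w_shape * gaussian_similarity(intensity)
    return 1.0 / (1.0 + np.exp(-z))

x = np.arange(30)
clean = np.exp(-0.5 * ((x - 15) / 3.0) ** 2) * 100   # bell-shaped peak
noisy = np.random.default_rng(1).random(30) * 100    # pure noise
print(peak_quality(clean) > peak_quality(noisy))     # True
```

Thresholding this probability is what replaces the manual "is this a peak or not" call described above: the model is calibrated once against expert labels, then applied reproducibly to every mass feature.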