ResearchPad - computing-methods https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Scedar: A scalable Python package for single-cell RNA-seq exploratory data analysis]]> https://www.researchpad.co/article/elastic_article_13837 In single-cell RNA-seq (scRNA-seq) experiments, the number of individual cells has increased exponentially while the sequencing depth of each cell has decreased significantly. As a result, analyzing scRNA-seq data requires careful consideration of program efficiency and method selection. To reduce the complexity of scRNA-seq data analysis, we present scedar, a scalable Python package for scRNA-seq exploratory data analysis. The package provides a convenient and reliable interface for performing visualization, imputation of gene dropouts, detection of rare transcriptomic profiles, and clustering on large-scale scRNA-seq datasets. The analytical methods are efficient and do not assume that the data follow particular statistical distributions. The package is extensible and modular, facilitating the development of further functionality by the open-source community. The scedar package is distributed under the terms of the MIT license at https://pypi.org/project/scedar.
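As one illustration of the kind of dropout imputation the package describes, the sketch below fills zero counts in a cells-by-genes matrix from each cell's nearest neighbors. It is a minimal, hypothetical example using NumPy and scikit-learn, not scedar's actual API; the function name and parameters are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_impute_dropouts(counts, n_neighbors=3):
    """Replace zero entries (candidate dropouts) in a cells-by-genes
    count matrix with the mean of the same gene in the k nearest cells."""
    nn = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(counts)
    _, idx = nn.kneighbors(counts)           # idx[:, 0] is the cell itself
    imputed = counts.astype(float).copy()
    for i in range(counts.shape[0]):
        neighbor_means = counts[idx[i, 1:]].mean(axis=0)
        zeros = counts[i] == 0
        imputed[i, zeros] = neighbor_means[zeros]
    return imputed

rng = np.random.default_rng(0)
X = rng.poisson(5, size=(20, 10))            # toy count matrix
X[X < 2] = 0                                 # simulate dropouts
X_imp = knn_impute_dropouts(X)
```

Observed (nonzero) counts are left untouched; only zeros are replaced, which is the usual convention for dropout imputation.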

]]>
<![CDATA[A deadline constrained scheduling algorithm for cloud computing system based on the driver of dynamic essential path]]> https://www.researchpad.co/article/5c8c195bd5eed0c484b4d4af

To solve the problem of deadline-constrained task scheduling in cloud computing systems, this paper proposes a deadline-constrained scheduling algorithm for cloud computing based on the driver of dynamic essential path (Deadline-DDEP). Based on the changes in the dynamic essential path of each task node during scheduling, a dynamic sub-deadline strategy is proposed. The strategy assigns a different sub-deadline value to every task node so as to satisfy the constraint relations among task nodes and the user-defined deadline, fully accounting for how each node's dynamic essential path affects its sub-deadline during scheduling. The paper also proposes a quality assessment of optimization cost strategy to select a server for each task node. Based on the sub-deadline urgency and the relative execution cost during scheduling, this strategy selects a server that not only meets the sub-deadline but also achieves a much lower execution cost. In this way, the proposed algorithm completes the task graph within its deadline while minimizing the total execution cost. Finally, we evaluate the proposed algorithm in simulation experiments using Matlab tools. The experimental results show that the proposed algorithm reduces the total execution cost by 10.3% to 30.8% while meeting the deadline constraint. In view of these results, the proposed algorithm provides better-quality scheduling solutions for scientific application tasks in cloud computing environments than IC-PCP, DCCP and CD-PCP.
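The server-selection rule described above ("meet the sub-deadline, then minimize cost") can be sketched as follows. This is a simplified illustration with hypothetical field names, not the paper's Deadline-DDEP implementation:

```python
def pick_server(servers, sub_deadline):
    """Among servers that can finish before the task's sub-deadline,
    pick the cheapest; if none can, fall back to the fastest."""
    feasible = [s for s in servers if s["finish_time"] <= sub_deadline]
    if feasible:
        return min(feasible, key=lambda s: s["cost"])
    return min(servers, key=lambda s: s["finish_time"])

# Hypothetical per-server estimates for one task node.
servers = [
    {"name": "s1", "finish_time": 8.0, "cost": 5.0},
    {"name": "s2", "finish_time": 6.0, "cost": 9.0},
    {"name": "s3", "finish_time": 12.0, "cost": 2.0},
]
best = pick_server(servers, sub_deadline=10.0)
```

With a sub-deadline of 10.0, s3 is excluded as too slow and s1 wins as the cheaper of the two feasible servers; the real algorithm additionally weighs sub-deadline urgency when trading cost against speed.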

]]>
<![CDATA[A metamorphic testing approach for event sequences]]> https://www.researchpad.co/article/5c75ac5fd5eed0c484d0865d

Test oracles are commonly used in software testing to determine the correctness of the execution results of test cases. However, the testing of many software systems faces the test oracle problem: a test oracle may not always be available, or it may be available but too expensive to apply. One such class of systems comprises software involving abundant business processes. This paper focuses on the testing of business-process-based software systems and proposes a metamorphic testing approach for event sequences, called MTES, to alleviate the oracle problem. We use event sequences to represent business processes and then apply metamorphic testing to test the system without test oracles. To apply metamorphic testing, we study general rules for identifying metamorphic relations for business processes and demonstrate specific metamorphic relations in individual case studies. Three case studies were conducted to evaluate the effectiveness of our approach. The experimental results show that our approach is feasible and effective in testing applications with rich business processes. In addition, this paper summarizes the experimental findings and proposes guidelines for selecting good metamorphic relations for business processes.
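A metamorphic relation over event sequences can be illustrated with a toy example. The system under test and the relation below are hypothetical, not MTES itself; the point is that the source and follow-up outputs are compared to each other, so no exact oracle is needed:

```python
def process(events):
    """Toy system under test: replay cart events and return the total."""
    total = 0
    for op, amount in events:
        total += amount if op == "add" else -amount
    return total

def mr_cancelling_pair(events, amount=7):
    """Metamorphic relation: appending an add/remove pair for the same
    amount must not change the result. We never need to know the
    correct total, only that the two outputs agree."""
    follow_up = events + [("add", amount), ("remove", amount)]
    return process(events) == process(follow_up)

source = [("add", 3), ("add", 5), ("remove", 2)]
holds = mr_cancelling_pair(source)
```

A violation of the relation reveals a defect without ever consulting an oracle for the expected output of either sequence.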

]]>
<![CDATA[A quadratic trigonometric spline for curve modeling]]> https://www.researchpad.co/article/5c40f762d5eed0c48438600c

A curve modeling technique has been established with a view to its applications in various disciplines of science, engineering and design. It is a new spline method using piecewise quadratic trigonometric functions and possesses error bounds of order 3. The proposed curve model also has favorable geometric properties. The method achieves C2 smoothness and produces a Quadratic Trigonometric Spline (QTS) intended for applications in curve design and control. With four control points in its piecewise description, it provides a C2 quadratic trigonometric alternative to the traditional cubic polynomial spline (CPS). A comparative analysis of QTS and CPS shows that QTS is the better alternative, and a timing analysis shows that QTS is more computationally efficient than CPS.

]]>
<![CDATA[Intervention on default contagion under partial information in a financial network]]> https://www.researchpad.co/article/5c478c36d5eed0c484bd0d24

We study the optimal interventions of a regulator (a central bank or government) on the illiquidity default contagion process in a large, heterogeneous, unsecured interbank lending market. The regulator has only partial information on the interbank connections and aims to minimize the fraction of final defaults with minimal interventions. We derive analytical results for the asymptotic optimal intervention policy and the asymptotic magnitude of default contagion in terms of the network characteristics. We extend the results of Amini, Cont and Minca’s work to incorporate interventions, and adopt the dynamics of Amini, Minca and Sulem’s model to build heterogeneous networks with degree sequences and initial equity levels drawn from arbitrary distributions. Our results show that the optimal intervention policy is “monotonic” with respect to the intervention cost, the closeness to invulnerability, and connectivity. The regulator should prioritize interventions on banks that are systemically important or close to invulnerability. Moreover, once the regulator has intervened on a bank, it should keep intervening on it. Our simulation results show good agreement with the theoretical results.

]]>
<![CDATA[Comparison of visual assessment and computer image analysis of intracoronary thrombus type by optical coherence tomography]]> https://www.researchpad.co/article/5c2151bbd5eed0c4843fba32

Background

Analysis of intracoronary thrombus type by optical coherence tomography (OCT) imaging is highly subjective. We aimed to compare a newly developed image analysis method to subjective visual classification of thrombus type identified by OCT.

Methods

Thirty patients with acute ST elevation myocardial infarction were included. Thrombus type visually classified by two independent readers was compared with analysis using QCU-CMS software.

Results

Repeatability of the computer-based measurements was good. In receiver operating characteristic (ROC) analysis, area-under-curve (AUC) values for discrimination of white and red thrombi were 0.92 (95% confidence interval (CI) 0.83–1.00) for median attenuation, 0.96 (95% CI 0.89–1.00) for mean backscatter and 0.96 (95% CI 0.89–1.00) for mean grayscale intensity. A median attenuation of 0.57 mm⁻¹ (sensitivity 100%, specificity 71%), a mean backscatter of 5.35 (sensitivity 92%, specificity 94%) and a mean grayscale intensity of 120.1 (sensitivity 85%, specificity 100%) were identified as the best cut-off values to differentiate between red and white thrombi.
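The kind of ROC analysis described here can be reproduced in outline with scikit-learn. The measurements below are synthetic stand-ins for the study's attenuation data, and Youden's J statistic is just one common way to pick a cut-off; the paper does not state which criterion was used:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical attenuation measurements: 1 = red thrombus, 0 = white.
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
attenuation = np.array([0.2, 0.3, 0.4, 0.5, 0.6,
                        0.7, 0.8, 0.9, 1.0, 1.1])

auc = roc_auc_score(y, attenuation)
fpr, tpr, thresholds = roc_curve(y, attenuation)
best = thresholds[np.argmax(tpr - fpr)]      # Youden's J cut-off
```

With these perfectly separated toy values the AUC is 1.0 and the chosen cut-off falls at the lowest red-thrombus measurement; real data produce the intermediate sensitivity/specificity trade-offs reported above.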

Conclusions

Attenuation, backscatter and grayscale intensity of thrombi in OCT images differentiated red and white thrombi with high sensitivity and specificity. Measurement of these continuous parameters can be used as a less user-dependent method to characterize in vivo thrombi. The clinical significance of these findings needs to be tested in further studies.

]]>
<![CDATA[A searchable personal health records framework with fine-grained access control in cloud-fog computing]]> https://www.researchpad.co/article/5c2400a1d5eed0c484098466

Fog computing extends cloud computing to the edge of the network to reduce latency and network congestion. However, existing encryption schemes are rarely designed for fog environments, resulting in high computational and storage overhead. To address terminal devices' demand for local information and the shortcomings of the cloud computing framework in supporting mobile applications, and taking a hospital scenario as an example, we propose a searchable personal health records framework with fine-grained access control in cloud-fog computing. The proposed framework combines attribute-based encryption (ABE) and searchable encryption (SE) to provide keyword search and fine-grained access control. When a keyword index and trapdoor match successfully, the cloud service provider returns only the relevant search results to the user, achieving a more accurate search. At the same time, the scheme is multi-authority, and the key leakage problem is solved by dividing the user secret key distribution task. Moreover, in the proposed scheme, we securely outsource part of the encryption and decryption operations to the fog node, which is effective both for local resources and for resource-constrained mobile devices. Based on the decisional q-parallel bilinear Diffie-Hellman exponent (q-DBDHE) assumption and the decisional bilinear Diffie-Hellman (DBDH) assumption, our scheme is proven secure. Simulation experiments show that our scheme is efficient in the cloud-fog environment.
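The keyword-index/trapdoor matching step can be illustrated with a deliberately simplified symmetric-key sketch. The proposed scheme uses ABE and bilinear pairings, which this toy example does not attempt; it only shows how a server can match opaque trapdoors against an index without ever seeing plaintext keywords:

```python
import hmac
import hashlib
import os

def trapdoor(key, keyword):
    """Deterministic keyword token: the server can compare tokens for
    equality without learning the underlying keyword."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

key = os.urandom(32)                     # held by the data owner/user
documents = {"doc1": ["diabetes", "insulin"], "doc2": ["fracture"]}

# The data owner uploads only keyword tokens, not keywords.
index = {doc: {trapdoor(key, w) for w in words}
         for doc, words in documents.items()}

# An authorized user searches by submitting a trapdoor.
hits = [doc for doc, toks in index.items()
        if trapdoor(key, "insulin") in toks]
```

In the actual scheme the trapdoor additionally encodes the user's attributes, so a match succeeds only when the access policy is satisfied.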

]]>
<![CDATA[Machine learning in population health: Opportunities and threats]]> https://www.researchpad.co/article/5c06f02fd5eed0c484c6d308

Abraham D. Flaxman and Theo Vos of the Institute for Health Metrics and Evaluation, University of Washington, discuss near-term applications for ML in population health and name their priorities for ongoing ML development.

]]>
<![CDATA[Inhibition of Plasmepsin V Activity Demonstrates Its Essential Role in Protein Export, PfEMP1 Display, and Survival of Malaria Parasites]]> https://www.researchpad.co/article/5989da9aab0ee8fa60ba33c7

A small molecule inhibitor of the malarial protease Plasmepsin V impairs protein export and cellular remodeling, reducing parasite survival in human erythrocytes.

]]>
<![CDATA[Efficient Unrestricted Identity-Based Aggregate Signature Scheme]]> https://www.researchpad.co/article/5989db3aab0ee8fa60bd45a1

An aggregate signature scheme allows anyone to compress multiple individual signatures from various users into a single compact signature. The main objective of such a scheme is to reduce the costs of storage, communication and computation. However, among existing aggregate signature schemes in the identity-based setting, some fail to achieve a constant-length aggregate signature or require a number of pairing operations that grows linearly with the number of signers, while others place restrictions on the aggregated signatures. The main challenge in building an efficient aggregate signature scheme is to compress signatures into a compact, constant-length signature without any restriction. To address these drawbacks, we use bilinear pairings to propose an efficient unrestricted identity-based aggregate signature scheme. Our scheme achieves both full aggregation and constant pairing computation. We prove that our scheme is existentially unforgeable under the computational Diffie-Hellman assumption.

]]>
<![CDATA[The Predictive Performance and Stability of Six Species Distribution Models]]> https://www.researchpad.co/article/5989da6dab0ee8fa60b936ad

Background

Predicting species’ potential geographical ranges with species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. To improve prediction, we need to assess the predictive performance and stability of different SDMs.

Methodology

We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values.
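The stability metrics used here (standard deviation, coefficient of variation, and a 99% confidence interval over repeated trials) can be sketched as follows. The AUC values below are simulated stand-ins for one model's 100 trials, and a normal approximation is assumed for the interval:

```python
import numpy as np

def stability_summary(scores, z=2.576):
    """Mean, standard deviation, coefficient of variation and a
    normal-approximation confidence interval (z=2.576 for ~99%)
    of a metric over repeated trials."""
    scores = np.asarray(scores, dtype=float)
    mean, sd = scores.mean(), scores.std(ddof=1)
    half = z * sd / np.sqrt(len(scores))
    return {"mean": mean, "sd": sd, "cv": sd / mean,
            "ci": (mean - half, mean + half)}

rng = np.random.default_rng(1)
auc_trials = rng.normal(0.9, 0.02, size=100)   # hypothetical AUC trials
summary = stability_summary(auc_trials)
```

A stable model shows a small standard deviation and coefficient of variation and a narrow interval; comparing these summaries across models is how the two SDM groups below are separated.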

Results

The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random selection of pseudo-absence points.

Conclusions

According to the prediction performance and stability of the SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.

]]>
<![CDATA[Bagging Statistical Network Inference from Large-Scale Gene Expression Data]]> https://www.researchpad.co/article/5989da0aab0ee8fa60b7743d

Modern biology and medicine aim to identify the molecular and cellular causes of biological functions and diseases. Gene regulatory networks (GRNs) inferred from gene expression data are considered an important aid for this research, providing a map of molecular interactions. Hence, GRNs have the potential to enable and enhance basic as well as applied research in the life sciences. In this paper, we introduce a new method called BC3NET for inferring causal gene regulatory networks from large-scale gene expression data. BC3NET is an ensemble method based on bagging the C3NET algorithm, and it corresponds to a Bayesian approach with noninformative priors. For a variety of simulated and biological gene expression datasets from S. cerevisiae, we demonstrate that BC3NET is an important enhancement over other inference methods, capable of sensibly capturing biochemical interactions from transcriptional regulation and protein-protein interactions. An implementation of BC3NET is freely available as an R package from the CRAN repository.
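The bagging idea behind BC3NET can be sketched generically: infer a network on each bootstrap sample of the expression profiles and keep the edges that recur across bags. The correlation-threshold inference step below is only a stand-in for C3NET's mutual-information estimator, and all thresholds are illustrative:

```python
import numpy as np

def bag_networks(data, infer, n_bags=50, vote=0.5, seed=0):
    """Bagging wrapper: infer a network on each bootstrap sample of the
    rows (samples) and keep edges present in >= `vote` of the bags."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    acc = np.zeros((p, p))
    for _ in range(n_bags):
        boot = data[rng.integers(0, n, size=n)]
        acc += infer(boot)
    return (acc / n_bags) >= vote

def corr_net(x, thresh=0.8):
    """Stand-in inference step: threshold absolute Pearson correlation."""
    c = np.abs(np.corrcoef(x, rowvar=False))
    np.fill_diagonal(c, 0)
    return c >= thresh

rng = np.random.default_rng(2)
g1 = rng.normal(size=200)                          # "gene 1" profile
data = np.column_stack([g1,                        # gene 1
                        g1 + 0.1 * rng.normal(size=200),  # co-regulated
                        rng.normal(size=200)])     # independent gene
net = bag_networks(data, corr_net)
```

The two co-varying columns are linked in essentially every bag, while the independent column is not; aggregating over bags is what gives the ensemble its robustness to sampling noise.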

]]>
<![CDATA[An Ensemble Classifier for Eukaryotic Protein Subcellular Location Prediction Using Gene Ontology Categories and Amino Acid Hydrophobicity]]> https://www.researchpad.co/article/5989d9f8ab0ee8fa60b70d36

With the rapid increase of protein sequences in the post-genomic age, it is challenging to develop accurate and automated methods for reliably and quickly predicting subcellular localizations. Many efforts have been made to date, but most used only a single algorithm. In this paper, we propose an ensemble classifier that combines KNN (k-nearest neighbor) and SVM (support vector machine) algorithms through a voting system to predict the subcellular localization of eukaryotic proteins. The overall prediction accuracies with the one-versus-one strategy are 78.17%, 89.94% and 75.55% on three benchmark datasets of eukaryotic proteins. The improved prediction accuracies reveal that GO annotations and amino acid hydrophobicity help predict the subcellular locations of eukaryotic proteins.
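A KNN+SVM voting ensemble of the kind described can be sketched with scikit-learn. The iris data below merely stands in for the protein feature vectors (GO categories and hydrophobicity) used in the paper:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hard-voting ensemble: each base classifier casts one vote per sample.
ensemble = VotingClassifier([("knn", KNeighborsClassifier(3)),
                             ("svm", SVC())], voting="hard")
ensemble.fit(X_tr, y_tr)
accuracy = ensemble.score(X_te, y_te)
```

Hard voting lets the two algorithms correct each other's isolated mistakes, which is the rationale the paper gives for combining them rather than using either alone.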

]]>
<![CDATA[Transcriptomic Analysis of Human Retinal Detachment Reveals Both Inflammatory Response and Photoreceptor Death]]> https://www.researchpad.co/article/5989da25ab0ee8fa60b80628

Background

Retinal detachment often leads to a severe and permanent loss of vision and its therapeutic management remains to this day exclusively surgical. We have used surgical specimens to perform a differential analysis of the transcriptome of human retinal tissues following detachment in order to identify new potential pharmacological targets that could be used in combination with surgery to further improve final outcome.

Methodology/Principal Findings

Statistical analysis reveals major involvement of the immune response in the disease. Interestingly, using a novel approach relying on coordinated expression, the interindividual variation was monitored to unravel a second crucial aspect of the pathological process: the death of photoreceptor cells. Within the genes identified, the expression of the major histocompatibility complex I gene HLA-C enables diagnosis of the disease, while PKD2L1 and SLCO4A1 (both down-regulated) act synergistically to provide an estimate of the duration of the retinal detachment process. Our analysis thus reveals the two complementary cellular and molecular aspects linked to retinal detachment: an immune response and the degeneration of photoreceptor cells. We also show that the human specimens have higher clinical value than artificial models, which point to IL6 and oxidative stress, neither of which was implicated in the surgical specimens studied here.

Conclusions/Significance

This systematic analysis confirmed the occurrence of both neurodegeneration and inflammation during retinal detachment, and further identifies precisely the modification of expression of the different genes implicated in these two phenomena. Our data henceforth give a new insight into the disease process and provide a rationale for therapeutic strategies aimed at limiting inflammation and photoreceptor damage associated with retinal detachment and, in turn, improving visual prognosis after retinal surgery.

]]>
<![CDATA[Perceiving Object Shape from Specular Highlight Deformation, Boundary Contour Deformation, and Active Haptic Manipulation]]> https://www.researchpad.co/article/5989da8dab0ee8fa60b9e96b

It is well known that motion facilitates the visual perception of solid object shape, particularly when surface texture or other identifiable features (e.g., corners) are present. Conventional models of structure-from-motion require the presence of texture or identifiable object features in order to recover 3-D structure. Is the facilitation in 3-D shape perception similar in magnitude when surface texture is absent? On any given trial in the current experiments, participants were presented with a single randomly-selected solid object (bell pepper or randomly-shaped “glaven”) for 12 seconds and were required to indicate which of 12 (for bell peppers) or 8 (for glavens) simultaneously visible objects possessed the same shape. The initial single object’s shape was defined either by boundary contours alone (i.e., presented as a silhouette), specular highlights alone, specular highlights combined with boundary contours, or texture. In addition, there was a haptic condition: in this condition, the participants haptically explored with both hands (but could not see) the initial single object for 12 seconds; they then performed the same shape-matching task used in the visual conditions. For both the visual and haptic conditions, motion (rotation in depth or active object manipulation) was present in half of the trials and was not present for the remaining trials. The effect of motion was quantitatively similar for all of the visual and haptic conditions–e.g., the participants’ performance in Experiment 1 was 93.5 percent higher in the motion or active haptic manipulation conditions (when compared to the static conditions). The current results demonstrate that deforming specular highlights or boundary contours facilitate 3-D shape perception as much as the motion of objects that possess texture. The current results also indicate that the improvement with motion that occurs for haptics is similar in magnitude to that which occurs for vision.

]]>
<![CDATA[Model Based Predictive Control of Multivariable Hammerstein Processes with Fuzzy Logic Hypercube Interpolated Models]]> https://www.researchpad.co/article/5989daadab0ee8fa60baa1c0

This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in the control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system in which membership functions act as the kernel functions of an interpolator. Conjunction of membership functions in a unitary hypercube space enables multivariable interpolation in N dimensions. Because the membership functions act as interpolation kernels, the choice of membership function determines the interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the problem of modeling static nonlinearities, since it is capable of modeling both a function and its inverse. Three study cases from the literature are presented: a single-input single-output (SISO) system, a MISO system and a MIMO system. Good results are obtained on performance metrics such as set-point tracking, control variation and robustness. The results demonstrate the applicability of the proposed method to modeling Hammerstein nonlinearities and their inverse functions for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC).
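The membership-functions-as-kernels idea can be illustrated in one dimension: with triangular membership functions on a uniform grid, the membership-weighted sum reduces to linear interpolation. This is a simplified sketch, not FLHI's N-dimensional hypercube formulation:

```python
import numpy as np

def flhi_1d(grid, values, x):
    """1-D membership-function interpolation: triangular memberships on
    a uniform grid act as kernels, and the normalized weighted sum of
    the stored values reduces to linear interpolation."""
    step = grid[1] - grid[0]
    weights = np.clip(1 - np.abs(x - grid) / step, 0, 1)
    weights /= weights.sum()
    return float(weights @ values)

grid = np.array([0.0, 1.0, 2.0, 3.0])
values = np.array([0.0, 2.0, 4.0, 6.0])   # samples of y = 2x
y = flhi_1d(grid, values, 1.5)            # halfway between 2.0 and 4.0
```

Swapping the triangular kernel for a cubic, spline or Lanczos kernel changes the interpolation characteristics without changing the surrounding machinery, which is the flexibility the paper attributes to FLHI.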

]]>
<![CDATA[Three-Dimensional Flow of an Oldroyd-B Nanofluid towards Stretching Surface with Heat Generation/Absorption]]> https://www.researchpad.co/article/5989db27ab0ee8fa60bd06dd

This article addresses the steady three-dimensional flow of an Oldroyd-B nanofluid over a bidirectional stretching surface with heat generation/absorption effects. Suitable similarity transformations are employed to reduce the governing partial differential equations to coupled nonlinear ordinary differential equations, which are then solved analytically using the homotopy analysis method (HAM). Results are presented graphically and discussed for the governing parameters, namely the Deborah numbers, the heat generation/absorption parameter, the Prandtl number, the Brownian motion parameter, the thermophoresis parameter, and the Lewis number. We find that increasing values of the Brownian motion and thermophoresis parameters increase the temperature field and thermal boundary layer thickness, while the opposite behavior is observed for the concentration field and concentration boundary layer thickness. To validate the present work, the numerical results are compared with the analytical HAM solutions for the limiting cases, and excellent agreement is noted.

]]>
<![CDATA[Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services]]> https://www.researchpad.co/article/5989db40ab0ee8fa60bd664f

Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data that contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we propose an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. The method ensures oblivious execution of a search query and enables authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of the proposed methodology shows that, in a collusion attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that could lead to a loss of data privacy. A computational time analysis on commodity hardware demonstrates that the proposed methodology requires only moderate computational resources to model a privacy-aware search query and to evaluate it obliviously on a cloud service provider.

]]>
<![CDATA[Application of Differential Evolution Algorithm on Self-Potential Data]]> https://www.researchpad.co/article/5989d9d2ab0ee8fa60b647b8

Differential evolution (DE) is a population-based evolutionary algorithm widely used for solving multidimensional global optimization problems over continuous spaces, and has been successfully applied to several kinds of problems. In this paper, differential evolution is used for the quantitative interpretation of self-potential data in geophysics. Six parameters are estimated, including the electrical dipole moment, the depth of the source, the distance from the origin, the polarization angle, and the regional coefficients. This study considers three kinds of data from Turkey: noise-free synthetic data, contaminated synthetic data, and a field example. The evolution of the model parameters is tracked across generations, and we show the variation of the parameters in the vicinity of the low-misfit region. Moreover, we show how the frequency distribution of each parameter relates to the number of DE iterations. Experimental results show that DE can solve the quantitative interpretation of self-potential data efficiently compared with previous methods.
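The inversion loop can be sketched with SciPy's built-in differential evolution. The two-parameter forward model below is a simplified, hypothetical stand-in for the six-parameter self-potential model used in the paper; the workflow (forward model, least-squares misfit, DE search within bounds) is the same:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy inverse problem: recover (depth, angle) of a synthetic source
# by minimising the misfit between observed and modelled anomalies.
x_obs = np.linspace(-10, 10, 41)

def model(params, x=x_obs):
    depth, angle = params
    return np.cos(angle) * depth / (x**2 + depth**2)

d_obs = model((3.0, 0.5))                       # synthetic "observations"
misfit = lambda p: np.sum((model(p) - d_obs) ** 2)

result = differential_evolution(misfit,
                                bounds=[(0.1, 10.0), (0.0, 1.5)],
                                seed=0, tol=1e-10)
```

The frequency distribution of each parameter across the population at successive generations, as examined in the paper, can be obtained by recording the population with the `callback` argument.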

]]>
<![CDATA[Comprehensive measurement of UVB-induced non-melanoma skin cancer burden in mice using photographic images as a substitute for the caliper method]]> https://www.researchpad.co/article/5989db4fab0ee8fa60bdb966

The vernier caliper has been used as a gold standard to measure the length, width and height of skin tumors in order to calculate their total area and volume. It is a simple method for collecting data on a few tumors at a time, but it becomes tedious, time-consuming and stressful for the animals and the operator when used to measure multiple tumors in a large number of animals in protocols such as UVB-induced non-melanoma skin cancer (NMSC) in SKH-1 mice. Here, we show that photographic images of these mice, taken within a few minutes under optimized conditions, can be subjected to computerized analysis to determine tumor volume and area as accurately and precisely as the caliper method. Unlike the caliper method, the photographic method also records the incidence and multiplicity of tumors, permitting comprehensive measurement of tumor burden in the animal. The simplicity and ease of this method will permit more frequent monitoring of tumor burden in long protocols, generating additional data about dynamic changes in cancer progression or the efficacy of therapeutic interventions. The photographic method can broadly substitute for the caliper method in quantifying other skin pathologies.

]]>