#mdss (ad)
piosplayhouse · 8 months
Text
In honor of the newest Japanese ad mdss art being the Yunmeng twin prides let us all remember the most unfortunately hilarious official mdzs art ever made: this Yunmeng twin prides art that made it look like wwx was absolutely railing a crying jc doggystyle
Tumblr media
153 notes · View notes
netnahas · 2 years
Text
Nexus 5x tools for screen replacement
Tumblr media
#Nexus 5x tools for screen replacement update
#Nexus 5x tools for screen replacement driver
#Nexus 5x tools for screen replacement android
The Linux mainline kernel has had very basic support for this phone since November 2016 (one CPU and UART). Support for SDHCI1 and pstore-ramoops was added in kernel 4.18. Further fixes were added later: SDHCI1, PSCI and cleanup in 5.9; an msm8994 overlay (as the hardware is very similar) and an updated regulator config in 5.12; the overlay and PSCI broke booting, which got fixed in 5.14. Here is the official Google page to download the GPS, Audio, Camera, Gestures, Graphics, DRM, Video and Sensors firmware blobs. The fingerprint sensor is an FPC1020, supported by a driver that exists in mainline.
#Nexus 5x tools for screen replacement driver
Mainline display driver since 4.17, properly enabled since 5.9 – c83e0951bcad ("arm64: dts: qcom: msm8992: Fix SDHCI1"). An out-of-tree panel driver can be generated using linux-mdss-dsi-panel-driver-generator. The phone uses Qualcomm's PM8994 controller, which works since 4.11. The current mainline Linux only supports Nexus 5X rev 1.01, but the upcoming 5.18 will also contain rev 1.0 – cd4bd4704ec8 ("arm64: dts: qcom: msm8992-lg-bullhead: Add support for LG Bullhead rev 1.0").

Unlocking: go to Settings → System → Developer Options and enable OEM unlocking, then reboot to the bootloader and run "fastboot oem unlock". Select Yes on the phone screen with the volume buttons and use the power button to accept. If your device says "SECURE BOOT: ENABLED (NO RPMB)" in the fastboot screen, the mainboard in your Nexus 5X is missing an image required by secure boot to function properly; if this is missing, OEM unlocking won't work on the device and it will re-lock on every reboot.

Boot to bootloader: hold Volume Down + Power until the screen turns on. Launch recovery: first boot to the bootloader, then press the Volume Down button twice and press the Power button to select.

Flashing: go to the bootloader, connect the device to the PC, then run the following command:

$ pmbootstrap flasher flash_rootfs --partition userdata

Now select Start from the menu in the bootloader screen. Reboot again if you don't see anything on the screen after the postmarketOS logo.
#Nexus 5x tools for screen replacement android
In Android, go to Settings → System → About Phone and tap the build number 7 times to enable Developer Options.

Users owning this device:
- UnDevDeCatOS (Notes: Main device, LineageOS 15.1)
- Nobodywasishere (Notes: I have 7 of these, with half having bootlooping issues)
- Kcroot (Notes: I have 4 pieces, started build)
- IonAgorria (Notes: LineageOS – boots but affected by the big.LITTLE core cluster issue)
- Chappo (Notes: MaruOS installed, daily driver, would prefer postmarketOS)
Tumblr media
0 notes
yingfa89 · 2 years
Text
Selling danmei Merch - Round 3
Edit: added new items!
Hi! I have decided to sell some MXTX merch that I will not be keeping. These items need a new home!
Everything is in good condition. Note that most items have been taken out of their original packaging.
If you are interested in buying, please PM me about it. Please do not send me anonymous messages if you are serious about buying.
Payment options: E-transfer or Paypal in CAD.
E-transfer will only work if you are in Canada. I am not responsible for any extra fees Paypal incurs.
Location: Canada
Shipping options: Canada Post (UPS is possible if you want it)
Price: Cost of item + shipping cost to you (to be determined)
The cost of item is based on initial cost plus what I had to pay to get the item to me, more or less.
From the moment I receive the payment from the buyer, I will do my best to send out the item within one week. I will let you know if I cannot meet that timeframe.
Items were originally purchased through the group order @shandian-go or Amazon Japan or Asia Gou (Twitter).
Please don't hesitate to ask any questions about the items!
Items to sell: (Shipping to be determined)
From @shandian-go​
MDZS x Xing Yun Shi [Bunny Necklace - LWJ Regular]: 35
MDZS x Xing Yun Shi [Bunny Necklace - WWX Regular] : 35
CQL [Booklet Set - A] : 10
CQL [Booklet Set - B] : 10
MDZS x Xing Yun Shi [Youth Mug - LWJ] : 17
MDZS x Xing Yun Shi [Youth Mug - WWX] : 17
(The mugs have not been used!)
TGCF x Miao Wu Silhouette pin : 10
(See old posts for pictures)
MDZS X Miniso - Lanyard set : 6 SOLD
XQL Flower fan Lotus : 24 SOLD
CQL Scarf Purple : 15  SOLD
CQL Scarf blue : 15  SOLD
MDZS X KAZE Ghost festival Standee : 15
TGCF x Minidoll - Travel Pillow  Xie Lian or Hua Cheng: 30 each
MDZS X AIMON Cardholder WWX or LWJ or LSZ: 15 each
MDZS Quing Cang Wangxian Mooncake Figurine set: 65
From Amazon Japan
MDSS/MDZS Audio Drama S1 Part 1 (JP) : 50
MDSS/MDZS Audio Drama S1 Part 2 (JP) : 80
(Please note that the box for Part 2 is a bit broken bc Amazon JP just packed it in a measly envelope with no padding... CDs are intact!)
From asiangou
CQL - WWX 2021 Birthday Standee : 25 SOLD
CQL light stick : 10
Tumblr media
14 notes · View notes
wei-yiing · 3 years
Text
Tumblr media
MDSS AD EP 3 LETS GET IT
the art for this EPISODE MAKES ME WANT TO EXPLODE omg LOOK AT THEM look at LWJ oh my FUCK i love gearous... so much...
i know i said it already but i love this jin ling. he doesn't sound as bratty or stuffy or nasally, ayumu murase's voice is too clear, but the haughtiness is definitely still prevalent and the clarity of his voice is actually really nice for jin ling??
aw wait:
gi musen, riding towards the temple on ringo-chan: hey, you brats!
ran keigi (ljy): who are you calling brats?! don't you know what sect we're from? just because you took off that funeral make up, don't think you can start acting like a senpai!
gi musen: right, right, right. then, "onii-san-tachi." fire your rescue signal...
wwx calling the juniors "onii-san-tachi" (as if he's addressing people senior to him) is actually adorable. 🥺
omg wwx is playing his shoddy makeshift bamboo flute and it's as terrible as you'd hope lmao
GUYS HE'S PLAYING WANGXIAN ITS HAPPENING
omg i think that was lwj. you could hear him like gasping for air as he ran towards wwx. OMG HE JUST SAID "KIMI GA?" ("is it you?") he's so reactive you can hear the hope in his voice oh my heart
wait this lwj is actually so good.
gi musen: my type? hm, i guess it would be someone like gankou-kun (hanguang-jun).
ran bouki: "itta na." ("you said it.")
the vibes are kinda like 'you actually said it.' or 'you said it, not me.' or even "you asked for it". you guys need to listen to this pjgjdhd i'm going insane. literally it only takes two words from lwj for me to go insane.
83 notes · View notes
Text
Kidney Disease Predictor Based on Medical Decision Support System
Tumblr media
Abstract
Renal failure increases mortality if untreated. When the kidneys fail, toxins build up, which affects the whole body and causes complications. There are numerous causes of renal failure, but we evaluate its main causes: hypertension, diabetes, glomerulonephritis, vesicoureteral reflux and polycystic kidney disease. Kidneys, in general, are very complicated organs. Nevertheless, most kidney diseases share many presenting symptoms, which may lead to some delay in medical diagnosis. This study aims to develop a decision support system that predicts the main cause of renal failure in patients using their memory, making quick predictions that may aid in the final diagnosis. A multilayer perceptron (MLP) feed-forward neural network was proposed in this research. The input layer of the proposed system included 32 input variables. An iterative process was used to determine the number of neurons and hidden layers. Furthermore, a resilient backpropagation algorithm (Rprop) was used to train the system. In order to assess the generalization of the proposed system, a 10-fold cross-validation scheme was used. We obtained encouraging prediction results from experiments on data taken from 180 patients’ medical records at seven hospitals in Jordan.
Keywords: Kidney Disease; Kidney Failure; Prediction of Kidney failure; Clinical Decision Support System
Introduction
According to the Hashemite Kingdom of Jordan Ministry of Health Focal Point for Health Information and the national registry for renal failure, about 300 new cases are added to this registry in Jordan every year (Prime Ministry). Significant lifesaving can be achieved if an accurate diagnosis can be made for patients suffering from various kidney diseases, and because kidney disease symptoms can be similar, an accurate diagnosis is not an easy task. The kidneys are vital major organs that keep the whole body in balance, so renal failure implies systemic effects on the whole body, with major resultant systemic complications.

The artificial neural network (ANN) field has gained momentum in almost every domain of research and has recently become a reliable tool in the medical domain [1-5]. ANNs can help in solving diagnostic and prognostic problems in a variety of medical domains by providing useful methods, techniques and tools. They are well suited to specialized hospitals and clinics, because many new cases are entered daily. As data, symptoms and diagnoses are added, an ANN can be applied to that data to help in the prediction of disease progression, the extraction of medical knowledge for outcomes research, therapy planning and support, and overall patient management. It can be used for data analysis, such as detecting regularities in the data by appropriately dealing with imperfect data, interpreting continuous data used in the Intensive Care Unit, and intelligent alarming for effective and efficient monitoring. ANN systems are very successful in the healthcare environment because they enhance medical experts’ work and improve the efficiency and quality of medical care. To reduce the diagnosis time and to improve its accuracy, a powerful medical decision support system (MDSS) has been developed.
A multilayer perceptron (MLP) feed-forward neural network is used to develop the system to diagnose the six main diseases causing renal failure. Multiple experiments are done with 30-32 input variables, whereas the output layer contains one neuron, which represents the disease cause for the patient case. In order to assess the generalization of the proposed system, a 10-fold cross-validation scheme is used. The data was taken from questionnaires of 180 patients suffering from renal failure due to one of the six causes. These questionnaires were collected from seven different hospitals in the Hashemite Kingdom of Jordan. As is known, medical diagnosis is by nature a complex and fuzzy cognitive process, and soft computing methods such as neural networks have been widely used to solve these medical problems.

In "Self Organization Maps for prediction of kidney dysfunction" by Ali [6], the Kohonen SOM network was used as a predictor of kidney dysfunction. The peculiarity of Kohonen networks is that they consist of two layers, an input layer and an output layer; the output layer can be two-dimensional. This system works as follows: first, it initializes the input nodes, the output nodes and the connection weights. Then it describes each set in order (N coordinates). After that, it computes the distance of all nodes. Finally, it finds the winning node, which is the one at minimum distance. All these steps follow specific mathematical calculations given in that paper. In "A multilayer perceptron-based medical decision support system for heart disease diagnosis" by Qatawneh et al. [2], a neural network was used to develop a medical decision support system to support diagnosis (Venous Thromboembolism Risk Classification, Applied Computing and Informatics). The computational model in that paper is based on a multilayer perceptron network.
That model consists of 3 layers: an input layer, a hidden layer and an output layer. The input layer takes 40 variables, the number of nodes in the hidden layer is determined through a cascade learning process, and the output has 5 nodes corresponding to the heart diseases. The system was applied to a large number of patient cases, which proved in the end that it has a strong capability to classify the 5 heart diseases with >90% accuracy. Another paper is "Using Artificial Neural Network to Predict Cirrhosis in Patients with Chronic Hepatitis B Infection with Seven Routine Laboratory Findings" by Vahdani [7]. The data for that system was obtained by taking specific tests and liver biopsies from all patients, from which the liver diseases were determined. Backpropagation and ANN analysis were used to train on the data. In that model, there were 8 input neurons, 15 neurons in the middle layer, and 1 output neuron. Notably, the data were divided into two groups, training and testing, with two thirds and one third of the data respectively, and multiple logistic regression models were fitted on the training group and applied to the test group for prediction. Turkoglu et al. [8] presented an expert diagnosis system for the interpretation of Doppler signals of heart valve diseases using a back-propagation neural network. The test results showed that this system was effective at detecting Doppler heart sounds; the correct classification rate was about 94% for normal subjects and 95.9% for abnormal subjects. All these studies depend on specific tests the patient goes through, used with a computerized system to reach a specific decision. Our system, on the other hand, depends on the patient’s memory to reach an approximate decision to start with.
Background and Related Works
The multilayer perceptron is one of the most frequently used neural network models due to its clear architecture and comparably simple algorithm. The multilayer perceptron system consists of 3 layers: one input layer, one or more hidden layers, and one output layer. The multilayer perceptron is a feed-forward network, which means that each layer receives its input from the previous layer. In other words, the signals flow from the input to the first hidden layer and are forwarded onwards until they finally reach the output layer. Feed-forward structures have proved most useful in solving non-linearly separable problems [9-14]. The process in a multilayer perceptron starts when the input layer serves the values of the input variables to the first (or the only) hidden layer. Then the hidden or output layer units (depending on which layer is currently processing) calculate their activation values by taking the weighted sum of the outputs of the units in the preceding layer. The activation value is passed through the activation function to produce the output of the neuron. When the process finishes, the last output (of the output layer) is the output of the whole process.
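The layer-by-layer computation described above can be sketched in a few lines of NumPy. This is a minimal illustration only: the layer sizes, the tanh activation and the random weights are assumptions for the example, not the paper's trained configuration.

```python
import numpy as np

def mlp_forward(x, weights, biases, activation=np.tanh):
    """Feed-forward pass: each layer's output is the activation of the
    weighted sum of the previous layer's outputs."""
    a = x
    for W, b in zip(weights, biases):
        a = activation(W @ a + b)
    return a

# Example: 32 inputs -> 16 hidden neurons -> 1 output, random weights
rng = np.random.default_rng(0)
weights = [rng.normal(size=(16, 32)), rng.normal(size=(1, 16))]
biases = [np.zeros(16), np.zeros(1)]
y = mlp_forward(rng.normal(size=32), weights, biases)
```

With tanh at every layer, the single output lands in (-1, 1), which is why the paper compares a TANH output layer against a linear one.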
MLP neural networks have been applied successfully to solve difficult and diverse problems by training them in a supervised manner with a highly popular algorithm known as backpropagation, which uses the data to adjust the network’s weights and biases in a manner that minimizes the error in its predictions on the training set [15,16]. Backpropagation is the training or learning algorithm rather than the network itself (Robert Gordon University) (Figure 1). To see how it works, consider the following example: if we put the first pattern into the network, we would like the output to be 0 1, as shown in Figure 2 (a black pixel is represented by 1 and a white one by 0). The input and its corresponding target are called a training pair. Once the network is trained, it will provide the desired output for any of the input patterns.
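A single backpropagation update for one training pair can be sketched as follows. This is illustrative only: it uses a one-layer network with a sigmoid output and squared-error loss with a fixed learning rate, not the paper's Rprop configuration, and the 4-pixel pattern and target are invented for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_pair(W, b, x, target, lr=0.5):
    """One backpropagation step: forward pass, error, gradient, update."""
    y = sigmoid(W @ x + b)          # forward pass
    err = y - target                # prediction error
    grad = err * y * (1 - y)        # output delta for squared-error loss
    W -= lr * np.outer(grad, x)     # adjust weights toward lower error
    b -= lr * grad
    return W, b

# A training pair: a 4-pixel input pattern and its desired output [0, 1]
x = np.array([1.0, 0.0, 0.0, 1.0])
target = np.array([0.0, 1.0])
W, b = np.zeros((2, 4)), np.zeros(2)
for _ in range(1000):
    W, b = train_pair(W, b, x, target)
```

After enough repetitions the network's output for this pattern approaches the target, which is exactly the "training pair" behaviour the example above describes.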
A complementary learning fuzzy neural network was proposed in [1] for ovarian cancer diagnosis. In [17-19] a modified fuzzy cellular neural network was proposed to effectively segment CT liver images, which helps in the early diagnosis of liver cancer. The Adaptive Neuro-Fuzzy Inference System (ANFIS) is one of the intelligent systems that has shown promising performance in different aspects of our lives, most widely in medical applications. ANFIS has been implemented in many medical diagnoses such as human action recognition [20-23] and epileptic seizures [16,24]. A content-based image retrieval system, as a tool for discrimination between normal and abnormal medical images, was developed in [25]; other applications cover heart valve diseases [26], rheumatoid arthritis [27,28], prostate cancer [7,29-32], and breast cancer [33]. ANFIS showed an overall accuracy in detecting glaucoma of 90.0% as reported in [33]. ANFIS showed better performance in detecting four types of brain tumour when compared with probabilistic neural network classifiers [34].

System Architecture

This paper consists of two systems, each with four different experiments; the first and the second systems use a linear activation function and a TANH activation function as the output function, respectively. The main difference between their experiments is the input variables. These variations and different experiments are done to achieve higher classification accuracy by adding or suspending some inputs. The input variables are between 30-32 inputs that are gathered from patients or patients’ relatives, depending on their recollection. These variables can be divided into five categories:

a) Basic information of a patient (including age and gender)
b) The patient’s history (before dialysis)
c) The patient’s family history
d) Symptoms
e) Physical examination

The questionnaire that was used was filled in by patients.
The survey consists of questions that are easy for laypeople, specific and direct to the point. Most of the attributes were assigned a yes or no value to indicate the presence or absence of an attribute. The remaining attributes are designated as follows:

a) Age when starting dialysis, vintage years on dialysis, and time of diagnosis of high blood pressure: all values are normalized into the range (0-1).
b) The hemoglobin A1C test and daily blood sugar have three values (low, high, average).

Input Variables Encoding Scheme

Neural networks only deal with numerical values; therefore, the 32 (or 31, 30) variables are encoded into numerical values using the following scheme:
a) Age, number of years on dialysis, and time of diagnosis of high blood pressure are all normalized into the range (0-1).
b) Variables with two attributes are coded as 0 and 1, where 0 is for the absence of a specific symptom and 1 for its presence; likewise, for the gender attribute, 0 represents male and 1 represents female.
c) Variables with 3 attributes, like the types of sugar test, are coded as -1, 0, 1, where -1 represents the low level, 0 the average, and 1 the high level.

After encoding, the training dataset was standardized to have zero mean and unit standard deviation, and based on the statistics of the training dataset, the validation and test datasets were also normalized to zero mean and unit standard deviation.

Number of Hidden Layers and Hidden Neurons

Because determining the number of hidden layers and hidden neurons in each layer of feed-forward networks is an unresolved question, a repeated process is used to find the best number of hidden layers and neurons in each hidden layer. In the repeated process, a ten-fold cross-validation technique is used to assess the generalization of each architecture. The whole process works as follows:

Step 1: Start testing with one hidden layer, applying the following equation to find the number of neurons in the first hidden layer:

n_f = (n_i + n_o) / 2    (1)

where n_f is the number of neurons in the first hidden layer, n_i is the number of neurons in the input layer and n_o is the number of neurons in the output layer. If n_f is not a whole number, apply the ceil and floor operations to get two values, and for more precise results take a third number, floor(n_f) - 1, so that you have three candidate numbers of neurons in the first layer to start with.

Step 2: Add another layer; the number of neurons in this layer will be half the number of neurons in the previous layer.
Step 3: Repeat step two until the number of hidden neurons in the layer is equal to one.

Data Preparation

The renal failure causes dataset used to train and test our systems consists of a total of 180 cases, 30 cases for each of the six diseases, gathered from seven hospitals in Jordan. Note that the 180 cases are taken from a total of 313 questionnaires, reduced to 180 due to the lack of cases of VUR and GN, which reached only 30 and 34 respectively. This paper consists of two models, each with four different experiments; the first and the second models use a linear activation function and a TANH activation function as the output function, respectively. The main difference between their experiments is the input variables. These variations and different experiments are done to achieve higher classification accuracy by adding or suspending some inputs. The input variables are between 30-32 inputs that are gathered from patients or patients’ relatives, depending on their recollection. These variables can be divided into five categories: basic information of a patient (including age and gender); the patient’s history (before dialysis); the patient’s family history; symptoms; and physical examination.

To estimate the performance of the system and its accuracy and to improve its generalization, we used a technique called cross-validation, which determines the accuracy by dividing the number of correct classifications by the overall number of records in the dataset (Yan et al., 2006). This technique works by partitioning the dataset into training data, validation data and testing data: the training data is used to perform the analysis, the testing data to test the analysis, and the validation data to avoid overfitting of the network [35-37]. In this paper we used 10 folds representing different partitions, to improve the generalization of the entire network model; each fold consists of training data, testing data and validation data.
The split is 80% training data, 10% validation data and 10% testing data. Since we have a dataset consisting of 424 records, the training data has 340 records, with 42 records for testing data and 42 records for validation data in each fold. For the training data, we have 170 records representing benign diagnoses and 170 malignant diagnoses, and for the validation and testing data we have 21 records representing benign diagnoses and 21 malignant diagnoses.

Experimental Results

In our model, four experiments are done depending on the number of inputs. The following parameters are used:
a) A Feedforward Back-propagation neural network is used for building all models.
b) The number of neurons, in four models, in the input layer are 32, 31, 31, 30 (representing the symptoms before renal failure)
c) The number of neurons in the output layer is 1 in all models (representing one class of the cause the network will generate).
d) The training algorithm used to train the models is resilient backpropagation.
e) The activation function used for all the hidden layers is TANH; on the other hand, the activation function used in the output layer is LF (linear function) or TANH (the difference between the two models).
f) The values 0.0000001 and 6 were used for the performance goal error and the number of validation checks (to avoid overfitting of the network), respectively.

When building the two models and the four experiments in each model, we started with one hidden layer and ended with 5-6 hidden layers, depending on the experiment (Table 1).
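The hidden-layer architecture search described above (Equation (1) for the first layer, then repeated halving down to one neuron) can be sketched as follows. This is an illustrative helper, not the paper's code; in particular, integer halving via floor division is an assumption, since the paper does not say how it rounds odd layer sizes.

```python
from math import ceil, floor

def first_layer_candidates(n_inputs, n_outputs):
    """Equation (1): n_f = (n_i + n_o) / 2. If n_f is not a whole
    number, take ceil(n_f), floor(n_f) and floor(n_f) - 1 as candidates."""
    nf = (n_inputs + n_outputs) / 2
    if nf.is_integer():
        return [int(nf)]
    return [ceil(nf), floor(nf), floor(nf) - 1]

def candidate_architectures(n_inputs, n_outputs):
    """Steps 2-3: keep halving the previous layer until one neuron remains."""
    archs = []
    for nf in first_layer_candidates(n_inputs, n_outputs):
        layers = [nf]
        while layers[-1] > 1:
            layers.append(layers[-1] // 2)  # assumed integer halving
        archs.append(layers)
    return archs

# 32 inputs and 1 output, as in the first experiment:
print(candidate_architectures(32, 1))
# → [[17, 8, 4, 2, 1], [16, 8, 4, 2, 1], [15, 7, 3, 1]]
```

Each candidate list is then evaluated with ten-fold cross-validation, and the best-performing architecture is kept.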
There is not much difference between experiments 2 and 3, and this is due to the two input variables that are ignored in these experiments. These two inputs are related, as both concern age in general. The second experiment is considered slightly better to use for several reasons. First, some of these diseases are connected to age, as some of them damage the kidneys over a long time while others do not; and due to the difference of age in the samples, some diseases affected young people and some did not, which affects the number of years on dialysis. Another reason is that diseases like diabetes, with lack of care, may affect other organs, which can lead to death within a short time.

Model 2 (TANH)

In model 2 the TANH activation function is used between the last hidden layer and the output layer. Model 2 did not succeed at all; its classification accuracies in all four experiments were poor. To conclude from the previous results of the models, the LF model works better than the TANH model no matter the number of inputs we use, and experiments are the only way to determine the best network architecture that can be used to solve a specific problem (Table 3).

Conclusion

In this research, the maximum result we reached was about 67% classification accuracy. The result was expected, due to the kind of questionnaire that we used. As we mentioned before, this questionnaire depends on patients’ and their relatives’ memory. Note that many patients have spent about 20-30 years on dialysis, and the questionnaire is about what happened before dialysis, which means that patients have to remember things they suffered from 20-30 years ago. Another thing to consider is the age of these patients: very old patients and very young patients do not always focus, which made us ask their close relatives, and that raises another issue, as relatives might not be aware of some detailed questions, leading us into the mistake of guessing.
Additionally, the experiments do not depend on lab results, because most of them are missing. Finally, a system that helps doctors to predict the cause of renal failure with 68% accuracy, within 5 minutes and with no lab tests, is to some extent a successful system, and I hope that it may help doctors to deal with serious situations when time is lacking.
To Know More About Trends in Technical and Scientific Research Please click on: https://juniperpublishers.com/ttsr/index.php
To Know More About Open Access Journals Please click on: https://juniperpublishers.com/index.php
0 notes
wei-yiing · 3 years
Text
Tumblr media
mdss ad ep 2 listen let's fucjing GO
also the jc on this episodes cover is Dashing i must say
i really like the sound direction on this ad. i think it's really well balanced ! still struggling to get used to this wwx though
omg sobbing lil apple is RINGO-CHAN
also ayumu murase as jin ling (kin ryou) is SPOT ON i was worried he would be exactly like hinata but he doesn't sound like hinata at all!! we stan a versatile king
aaah jiang cheng (kou chou) is here and he's not as scary as other adaptations? still scary, but maybe it's cos midorikawa's voice isn't as low as guo haoran
oh my god he's only said two words but i love satoshi hino's lan wangji (ran bouki). he sounds exactly like lwj should omg i love him.
as much as tatsuhisa suzuki isn't the wwx i'm used to he's actually doing a good job if i was just to assess his voice acting skills. he conveys the emotion of the character really well in certain parts, in a way that's realistic and not too over the top
wait i really like the music in this. like the background music it's so good. honestly the sound production is *mwah*
also nameless song still makes me cry. everything to do with mdzs makes me cry. sobs brb gotta expel my tears. although it is a little jarring going from the japanese voice acting to the chinese ending song lol, a japanese cover of nameless song would have been nice
how is the episode over already wtf. i'm just gonna watch the next one lol
also i hear the og wangxian.mp3 in the preview... 👀
19 notes · View notes