CNN based Dog Breed Classifier Using Stacked Pretrained Models



This article was published as a part of the Data Science Blogathon

In this article, we will learn how to classify images based on their fine details using a stacked pre-trained model in TensorFlow to get maximum accuracy.


Hey folks, I hope you have done some image classification using pre-trained CNN models in TensorFlow and have some idea of how we classify images. But when it comes to classifying finely detailed objects (dog breeds, cat breeds, leaf diseases), a single model often doesn't give us a good result; in this case, we prefer model stacking to capture most of the details. Let's get straight to the technicalities of it.

In our dataset, we have 120 dog breeds, and we will classify them using a stacked pre-trained model (ResNet50V2, DenseNet121) trained on ImageNet. We will stack the bottleneck features extracted by these models for greater accuracy, which will depend on the models we stack together.

How do Stacked Pretrained Models work?

When you work on a classification problem, you tend to use a classifier that focuses mainly on the max-pooled features, which means it does not take fine or small objects into account while training. This is why we use stacked models, which help to classify images based on both highlighted and fine object details.
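To see what max pooling does to fine detail, here is a minimal numpy sketch (illustrative only, not part of the classifier code) of 2×2 max pooling on a tiny feature map:

```python
import numpy as np

def max_pool_2x2(fmap):
    """Downsample a feature map by taking the max of each 2x2 block."""
    h, w = fmap.shape
    return fmap.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# A 4x4 feature map where a faint fine-grained activation (0.2)
# sits next to a strong edge activation (0.9).
fmap = np.array([[0.9, 0.2, 0.0, 0.0],
                 [0.1, 0.1, 0.0, 0.0],
                 [0.0, 0.0, 0.3, 0.1],
                 [0.0, 0.0, 0.1, 0.4]])
pooled = max_pool_2x2(fmap)
print(pooled)  # only the strongest activation per 2x2 block survives
```

The faint 0.2 activation disappears entirely, which is exactly the kind of fine detail a breed classifier needs.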

For creating a stacked model you need to use two or more classification architectures like Resnet, Vgg, Densenet, etc. These classifiers take an image as input and generate feature matrices based on their architecture. Normally each classifier goes ahead with the following stages in order to create a feature vector:

1. Convolution: It is the process of generating feature maps that depict the different image-specific features like edges, sharpness, etc of an image.

2. Max Pooling: In this process, highlighted features are extracted from the feature maps that are generated by the convolution process.

3. Flattening: In this process, final feature maps are converted into a vector of features.

After getting the feature vectors from the different models, we stack them together to form a feature matrix, which is then used as input for the final neural network model whose job is to classify these matrices into the final classes of data.
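The stacking step itself is just a concatenation along the feature axis. A small numpy sketch, with made-up feature sizes that happen to match the backbones used later (2048 and 1024):

```python
import numpy as np

rng = np.random.default_rng(0)
n_images = 4

# Hypothetical bottleneck features from two extractors
# (e.g. 2048-dim from one backbone, 1024-dim from another).
features_a = rng.random((n_images, 2048))
features_b = rng.random((n_images, 1024))

# Stacking = concatenating along the feature axis.
stacked = np.concatenate([features_a, features_b], axis=1)
print(stacked.shape)  # (4, 3072) -> input matrix for the final classifier
```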

Now that you know how a stacked model works, let's design one.

Before moving ahead let’s have a look at how we stack models to get bottleneck features.

[Image: designing stacked pretrained models]

In the above image, you can clearly see that we have two classifier models, VGG16 and ResNet, which are used as feature extractors and then stacked together to come up with a final set of features used for classification.

Let’s write the code to generate this architecture, using ResNet50V2 and DenseNet121 as the two feature extractors:

from keras.applications.resnet_v2 import ResNet50V2, preprocess_input as resnet_preprocess
from keras.applications.densenet import DenseNet121, preprocess_input as densenet_preprocess
from keras.layers import Input, Lambda, concatenate
from keras.models import Model

input_shape = (331, 331, 3)
input_layer = Input(shape=input_shape)

# First feature extractor
preprocessor_resnet = Lambda(resnet_preprocess)(input_layer)
resnet50v2 = ResNet50V2(weights='imagenet', include_top=False,
                        input_shape=input_shape, pooling='avg')(preprocessor_resnet)

# Second feature extractor
preprocessor_densenet = Lambda(densenet_preprocess)(input_layer)
densenet = DenseNet121(weights='imagenet', include_top=False,
                       input_shape=input_shape, pooling='avg')(preprocessor_densenet)

merge = concatenate([resnet50v2, densenet])
stacked_model = Model(inputs=input_layer, outputs=merge)
[Image: model summary (Source: Local)]

Here we have stacked two models (DenseNet121 and ResNet50V2). Both have include_top = False, which means we are only extracting bottleneck features; we then use a concatenate layer to merge them.
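To make the shapes concrete: with pooling='avg', each backbone collapses its final (H, W, C) feature maps into a C-dimensional vector, and the concatenate layer joins them. A numpy sketch (the spatial sizes here are assumptions for illustration; the channel counts 2048 and 1024 are the standard final widths of ResNet50V2 and DenseNet121):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical final feature maps of each backbone for one image.
resnet_maps = rng.random((11, 11, 2048))    # assumed spatial size
densenet_maps = rng.random((11, 11, 1024))  # assumed spatial size

# pooling='avg' = global average pooling over the spatial axes.
resnet_vec = resnet_maps.mean(axis=(0, 1))      # shape (2048,)
densenet_vec = densenet_maps.mean(axis=(0, 1))  # shape (1024,)

# The concatenate layer then yields 2048 + 1024 = 3072 features per image.
merged = np.concatenate([resnet_vec, densenet_vec])
print(merged.shape)  # (3072,)
```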

Building Dog Breed Classifier Using Stacked Pretrained Models

Now that you know how you can create a stacked model we can go ahead and start creating a Dog Breed Classifier.

[Image: training and inference workflow]

The above diagram shows the workflow of training and inference for the classifier. We will run the stacked model over our images to generate a feature matrix, which is then used as the input features of another neural network that classifies the dog breeds.

Now that you have a high-level picture of how the approach works, let's go through the step-by-step procedure for training and inference.

Loading Dataset

We will use the Dog Breed Identification dataset from Kaggle. We will load the data into a pandas dataframe named labels_dataframe and convert each y label into a numerical value.

We have mapped every dog breed label with some numbers.

import pandas as pd

# Data paths
train_dir = '/kaggle/input/dog-breed-identification/train/'
labels_dataframe = pd.read_csv('/kaggle/input/dog-breed-identification/labels.csv')
dog_breeds = sorted(list(set(labels_dataframe['breed'])))
n_classes = len(dog_breeds)
class_to_num = dict(zip(dog_breeds, range(n_classes)))
labels_dataframe['file_path'] = labels_dataframe['id'].apply(lambda x: train_dir + f"{x}.jpg")
# Map each breed name to its class number
labels_dataframe['breed'] = labels_dataframe['breed'].map(class_to_num)
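The breed-to-number mapping can be tried standalone with a few made-up breed names:

```python
# Toy illustration (assumed breed names) of the label-to-number mapping.
breeds = ['beagle', 'pug', 'beagle', 'whippet']
dog_breeds = sorted(set(breeds))
class_to_num = dict(zip(dog_breeds, range(len(dog_breeds))))
print(class_to_num)  # {'beagle': 0, 'pug': 1, 'whippet': 2}

numeric = [class_to_num[b] for b in breeds]
print(numeric)       # [0, 1, 0, 2]
```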
[Image: labels dataframe (Source: Local)]

Now we will have to convert the breed column into y_train using to_categorical.

from keras.utils import to_categorical
y_train = to_categorical(labels_dataframe.breed)
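to_categorical simply turns each integer label into a one-hot row. A small numpy equivalent, for intuition:

```python
import numpy as np

def one_hot(labels, n_classes):
    """Same idea as keras.utils.to_categorical: integer label -> one-hot row."""
    out = np.zeros((len(labels), n_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

y = one_hot([0, 2, 1], n_classes=3)
print(y)
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]
```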

Extract Bottleneck Features in Stacked Pretrained Models

In this step, we will use the stacked_model that we have just designed to extract bottleneck features, and these extracted features will become the X_train for our training. We use batches so that we don't run into OOM (Out of Memory) issues.

# For feature extraction, the dataframe must contain file_path and breed columns
def feature_extractor(df):
    img_size = (331, 331, 3)
    data_size = len(df)
    batch_size = 20
    # 3072 = 2048 (ResNet50V2) + 1024 (DenseNet121) bottleneck features;
    # float32, not uint8, so the pooled feature values are not truncated
    X = np.zeros([data_size, 3072], dtype=np.float32)
    # No image augmentation needed here: we are only predicting features
    datagen = ImageDataGenerator()
    generator = datagen.flow_from_dataframe(df,
                                            x_col='file_path',
                                            class_mode=None,
                                            batch_size=batch_size,
                                            shuffle=False,
                                            target_size=img_size[:2],
                                            color_mode='rgb')
    i = 0
    for input_batch in tqdm(generator):
        input_batch = stacked_model.predict(input_batch)
        X[i * batch_size:(i + 1) * batch_size] = input_batch
        i += 1
        if i * batch_size >= data_size:
            break
    return X

X_train = feature_extractor(labels_dataframe)
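The batching logic can be exercised without Keras by swapping in a stand-in feature extractor. This sketch (all names and sizes are made up) shows how the buffer fills and why the explicit break is needed: flow_from_dataframe generators loop forever.

```python
import numpy as np

data_size, batch_size, n_features = 45, 20, 8

def fake_stacked_model_predict(batch):
    # Stand-in for stacked_model.predict: one feature row of ones per image.
    return np.ones((len(batch), n_features))

def batches_forever():
    # Like a Keras generator: yields batches endlessly, last batch is smaller.
    start = 0
    while True:
        size = min(batch_size, data_size - start)
        yield np.zeros((size, 4))  # dummy "images"
        start = (start + size) % data_size

X = np.zeros((data_size, n_features), dtype=np.float32)
i = 0
for input_batch in batches_forever():
    feats = fake_stacked_model_predict(input_batch)
    X[i * batch_size:(i + 1) * batch_size] = feats
    i += 1
    if i * batch_size >= data_size:
        break  # without this, the loop would never end

print(X.sum())  # 360.0 -> each of the 45 rows holds 8 ones, filled exactly once
```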

Here X_train (all the features extracted by our pre-trained stacked model) is the new X_train that we will use for the final model.

Creating Predictor Model for Training

Now we will create a simple model that takes X_train (features extracted by the stacked model) and y_train (categorical values); this will be the final predictor model.

import keras
from keras.layers import InputLayer, Dropout, Dense

predictor_model = keras.models.Sequential([
    InputLayer(X_train.shape[1:]),
    Dropout(0.7),
    Dense(n_classes, activation='softmax')
])
predictor_model.compile(optimizer='adam',
                        loss='categorical_crossentropy',
                        metrics=['accuracy'])
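The final Dense layer's softmax turns raw scores into class probabilities; a numpy sketch with hypothetical logits for a 5-class problem:

```python
import numpy as np

def softmax(logits):
    """Softmax as applied by the final Dense layer: logits -> probabilities."""
    e = np.exp(logits - logits.max())  # subtract max for numerical stability
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1, -1.0, 0.5]))
print(probs.sum())     # 1.0 -> a valid probability distribution
print(probs.argmax())  # 0  -> the predicted class
```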
from keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau

# Prepare callbacks
EarlyStop_callback = EarlyStopping(monitor='val_loss', patience=10,
                                   restore_best_weights=True)
checkpoint = ModelCheckpoint('/kaggle/working/checkpoint', monitor='val_loss',
                             mode='min', save_best_only=True)
lr = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=3,
                       min_lr=0.00001)
my_callback = [EarlyStop_callback, checkpoint, lr]

This block of code will automatically save the best epoch, stop training early, and reduce the learning rate if no further improvement in training is seen.
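The ReduceLROnPlateau rule can be sketched in a few lines (this is a simplification for intuition, not the exact Keras implementation):

```python
def simulate_reduce_lr(val_losses, lr=1e-3, factor=0.5, patience=3, min_lr=1e-5):
    """Halve the learning rate after `patience` epochs without improvement."""
    best, wait = float('inf'), 0
    for loss in val_losses:
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                lr, wait = max(lr * factor, min_lr), 0
    return lr

# val_loss improves twice, then plateaus for six epochs -> two halvings
final_lr = simulate_reduce_lr([0.9, 0.8, 0.8, 0.8, 0.8, 0.8, 0.8, 0.8])
print(final_lr)  # 0.00025
```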

Training the Model

Now we will train predictor_model on X_train and y_train; our X_test/y_test will be taken automatically via validation_split.

# Train a simple DNN on the extracted features.
# Here X_train holds the bottleneck features from the stacked pretrained model
history_graph = predictor_model.fit(X_train, y_train,
                                    batch_size=128,
                                    epochs=60,
                                    validation_split=0.1,
                                    callbacks=my_callback)

Using validation_split = 0.1 splits the dataset into train (90%) and validation (10%) sets.

Plot the results

We will plot the history of our training and figure out our performance. here history_graph is the history object which we will use to plot the history over every epoch.

import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(12, 12))
ax1.plot(history_graph.history['loss'], color='b', label='loss')
ax1.plot(history_graph.history['val_loss'], color='r', label='val_loss')
ax1.set_xticks(np.arange(1, 60, 1))
ax1.set_yticks(np.arange(0, 1, 0.1))
ax1.legend(['loss', 'val_loss'], shadow=True)
ax2.plot(history_graph.history['accuracy'], color='green', label='accuracy')
ax2.plot(history_graph.history['val_accuracy'], color='red', label='val_accuracy')
ax2.legend(['accuracy', 'val_accuracy'], shadow=True)
plt.show()
[Image: accuracy plot (Source: Local)]

We have trained our model efficiently and got a validation accuracy of approximately 85%. We can improve it further; we will talk about that at the end.

Save the Model

Finally, we will save both models in order to use them for inference later.

predictor_model.save('/kaggle/working/dogbreed.h5')
stacked_model.save('/kaggle/working/feature_extractor.h5')

Getting the Inferences from the Model

To begin with, we will extract the bottleneck features of the test images with stacked_model, and then we will pass the extracted features to predictor_model to get the class values.

img = load_img(img_path, target_size=(331, 331))
img = img_to_array(img)
img = np.expand_dims(img, axis=0)  # add a batch dimension (4D tensor)
extracted_features = stacked_model.predict(img)
y_pred = predictor_model.predict(extracted_features)

y_pred is a prediction array of shape (1, 120), holding the probability of each class. Now we need to find the class number with the highest probability and then convert that class number to a class label using the class_to_num dictionary that we have already defined.

def get_key(val):
    # Reverse lookup: class number -> breed name
    for key, value in class_to_num.items():
        if val == value:
            return key

pred_codes = np.argmax(y_pred, axis=1)
predicted_dog_breed = get_key(pred_codes[0])
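The argmax-and-reverse-lookup step can be tried standalone with toy values (the mapping and probabilities here are assumed for illustration):

```python
import numpy as np

# Toy version of class recovery: pick the most probable class and map
# the class number back to its breed name.
class_to_num = {'beagle': 0, 'pug': 1, 'whippet': 2}
y_pred = np.array([[0.1, 0.7, 0.2]])  # hypothetical (1, n_classes) output

pred_code = int(np.argmax(y_pred, axis=1)[0])
num_to_class = {v: k for k, v in class_to_num.items()}
print(num_to_class[pred_code])  # pug
```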
[Image: output (Source: Local)]


In this article, we built a dog breed classifier using a stacked model (DenseNet121, ResNet50V2) and got a validation accuracy of about 85%. Nevertheless, we can improve this accuracy further.

Improving Accuracy Further

  • Try to stack other deep layered pre-trained models (VGG19, VGG16, etc.)
  • Perform data augmentation before feature extraction 
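As a sketch of the second idea, even a simple horizontal flip performed before feature extraction doubles the training images (the sizes follow the 331×331×3 input used above; this is an illustration, not the full augmentation pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
images = rng.random((3, 331, 331, 3))  # 3 dummy training images
labels = np.array([0, 1, 2])

# Flip along the width axis, then append flipped copies and their labels.
flipped = images[:, :, ::-1, :]
aug_images = np.concatenate([images, flipped])
aug_labels = np.concatenate([labels, labels])
print(aug_images.shape, aug_labels.shape)  # (6, 331, 331, 3) (6,)
```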

Now you know that, using different pre-trained models, you can create other classifier models that do a better job of classifying images. Go ahead and create your own stacked model for your use case.

Thanks for reading this article! Do like it if you have learned something new, and feel free to comment. See you next time!!! ❤️

The media shown in this article on stacked pretrained models are not owned by Analytics Vidhya and are used at the Author’s discretion.


