<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>tensorflow Archives - Petamind</title>
	<atom:link href="https://petaminds.com/tag/tensorflow/feed/" rel="self" type="application/rss+xml" />
	<link>https://petaminds.com/tag/tensorflow/</link>
	<description>A.I, Data and Software Engineering</description>
	<lastBuildDate>Sat, 26 Mar 2022 02:11:59 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://petaminds.com/wp-content/uploads/2019/09/ic_launcher.png</url>
	<title>tensorflow Archives - Petamind</title>
	<link>https://petaminds.com/tag/tensorflow/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Advanced Keras &#8211; Custom loss functions</title>
		<link>https://petaminds.com/advanced-keras-custom-loss-functions/</link>
					<comments>https://petaminds.com/advanced-keras-custom-loss-functions/#comments</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Wed, 23 Mar 2022 00:31:00 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[cost function]]></category>
		<category><![CDATA[custom loss]]></category>
		<category><![CDATA[K]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[keras backend]]></category>
		<category><![CDATA[loss function]]></category>
		<category><![CDATA[neural network]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1391</guid>

					<description><![CDATA[<p>When working on machine learning problems, you sometimes want to construct your own custom loss function(s). This article introduces the Keras backend for that purpose. Keras loss functions According to the Keras loss documentation, there are several built-in loss functions, e.g. mean_absolute_percentage_error, cosine_proximity, kullback_leibler_divergence, etc. When compiling a Keras model, we often pass two parameters, i.e. [&#8230;]</p>
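<p>As a flavour of what the article covers, here is a minimal custom loss sketch. The post builds losses from Keras backend primitives; plain TensorFlow ops are equivalent and are used here (the exact loss in the post may differ):</p>

```python
import tensorflow as tf

# Mean squared error rebuilt by hand, in the style of a custom Keras loss.
# Any function taking (y_true, y_pred) and returning one value per sample
# can be passed to model.compile(loss=...).
def custom_mse(y_true, y_pred):
    # average the squared error over the last axis (one value per sample)
    return tf.reduce_mean(tf.square(y_pred - y_true), axis=-1)

# Usage when compiling (model construction omitted):
# model.compile(optimizer="adam", loss=custom_mse)
```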
<p>The post <a href="https://petaminds.com/advanced-keras-custom-loss-functions/">Advanced Keras &#8211; Custom loss functions</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/advanced-keras-custom-loss-functions/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
			</item>
		<item>
		<title>Continue training big models on less powerful devices</title>
		<link>https://petaminds.com/continue-training-big-models-on-less-powerful-devices/</link>
					<comments>https://petaminds.com/continue-training-big-models-on-less-powerful-devices/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Sun, 22 Mar 2020 00:51:57 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[check-point]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[model]]></category>
		<category><![CDATA[out of memory]]></category>
		<category><![CDATA[save]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=2338</guid>

					<description><![CDATA[<p>It would be no surprise if you do not have a powerful, expensive machine to train a complicated model. You may run out of memory during training at some epoch. This article demonstrates a simple workaround for this. The problem Training deep learning models requires a lot of computing power. For [&#8230;]</p>
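<p>The workaround the article describes boils down to checkpointing. A minimal sketch (the model, file name, and data here are placeholders, not the article's exact code):</p>

```python
import numpy as np
import tensorflow as tf

# Save the weights after each epoch so training can resume on another
# (possibly less powerful) machine after an out-of-memory crash.
model = tf.keras.Sequential([tf.keras.Input(shape=(8,)),
                             tf.keras.layers.Dense(4),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

ckpt_path = "ckpt.weights.h5"  # hypothetical checkpoint file
checkpoint = tf.keras.callbacks.ModelCheckpoint(ckpt_path,
                                                save_weights_only=True)
x, y = np.random.rand(16, 8), np.random.rand(16, 1)
model.fit(x, y, epochs=1, verbose=0, callbacks=[checkpoint])

# Later: rebuild the same architecture, reload the weights, keep training.
model.load_weights(ckpt_path)
```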
<p>The post <a href="https://petaminds.com/continue-training-big-models-on-less-powerful-devices/">Continue training big models on less powerful devices</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/continue-training-big-models-on-less-powerful-devices/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Generate data on the fly &#8211; Keras data generator</title>
		<link>https://petaminds.com/generate-data-on-the-fly-keras-data-generator/</link>
					<comments>https://petaminds.com/generate-data-on-the-fly-keras-data-generator/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Fri, 31 Jan 2020 23:13:05 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[generator]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[python]]></category>
		<category><![CDATA[sequence]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1906</guid>

					<description><![CDATA[<p>Previously, we trained our models using pre-generated datasets, for example, in the recommender system or recurrent neural network articles. In this article, we demonstrate using a generator to produce data on the fly for training a model. Keras Data Generator with Sequence There are a couple of ways to create a data generator. However, [&#8230;]</p>
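<p>The Sequence approach mentioned in the excerpt can be sketched as follows (the batch contents here are random placeholders, not the article's dataset):</p>

```python
import math
import numpy as np
import tensorflow as tf

class RandomDataGenerator(tf.keras.utils.Sequence):
    """Hypothetical generator that produces each batch on demand."""
    def __init__(self, n_samples=64, batch_size=16):
        super().__init__()
        self.n_samples, self.batch_size = n_samples, batch_size

    def __len__(self):
        # number of batches per epoch
        return math.ceil(self.n_samples / self.batch_size)

    def __getitem__(self, idx):
        # build one (features, labels) batch on the fly
        x = np.random.rand(self.batch_size, 4)
        y = np.random.rand(self.batch_size, 1)
        return x, y

gen = RandomDataGenerator()
# Usage: model.fit(gen, epochs=...)
```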
<p>The post <a href="https://petaminds.com/generate-data-on-the-fly-keras-data-generator/">Generate data on the fly &#8211; Keras data generator</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/generate-data-on-the-fly-keras-data-generator/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>deep learning: Linear Autoencoder with Keras</title>
		<link>https://petaminds.com/deep-learning-linear-autoencoder-with-keras/</link>
					<comments>https://petaminds.com/deep-learning-linear-autoencoder-with-keras/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Mon, 09 Dec 2019 02:22:01 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[visualization]]></category>
		<category><![CDATA[auto]]></category>
		<category><![CDATA[decoder]]></category>
		<category><![CDATA[dimension]]></category>
		<category><![CDATA[encoder]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[reduction]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=2027</guid>

					<description><![CDATA[<p>This post introduces using a linear autoencoder for dimensionality reduction with TensorFlow and Keras. What is a linear autoencoder An autoencoder is a type of artificial neural network used to learn efficient data codings in an unsupervised manner. The aim of an autoencoder is to learn a representation (encoding) for a set of data, typically for dimensionality reduction, by training the network to ignore signal “noise”. Autoencoders consist [&#8230;]</p>
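<p>A linear autoencoder of the kind described can be sketched in a few lines of Keras (the 8-to-2 compression sizes here are assumptions, not the article's exact dimensions):</p>

```python
import tensorflow as tf

# Encoder compresses 8-D inputs to a 2-D code; decoder reconstructs them.
# No activations are used, which is what makes the autoencoder "linear".
inputs = tf.keras.Input(shape=(8,))
encoded = tf.keras.layers.Dense(2, name="encoder")(inputs)   # bottleneck
decoded = tf.keras.layers.Dense(8, name="decoder")(encoded)  # reconstruction
autoencoder = tf.keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
# Training pairs inputs with themselves: autoencoder.fit(x, x, ...)
```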
<p>The post <a href="https://petaminds.com/deep-learning-linear-autoencoder-with-keras/">deep learning: Linear Autoencoder with Keras</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/deep-learning-linear-autoencoder-with-keras/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Quick Benchmark Colab CPU GPU TPU (XLA-CPU)</title>
		<link>https://petaminds.com/quick-benchmark-colab-cpu-gpu-tpu-xla-cpu/</link>
					<comments>https://petaminds.com/quick-benchmark-colab-cpu-gpu-tpu-xla-cpu/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Tue, 12 Nov 2019 00:51:00 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[benchmark]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[tensorflow]]></category>
		<category><![CDATA[TPU]]></category>
		<category><![CDATA[XLA]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1809</guid>

					<description><![CDATA[<p>If you ever wonder about the performance differences between CPU, GPU, and TPU for your machine learning project, this article shows a simple benchmark of all three. Memory Subsystem Architecture Central Processing Unit (CPU), Graphics Processing Unit (GPU), and Tensor Processing Unit (TPU) are processors with specialized purposes and architectures. CPU: A processor designed [&#8230;]</p>
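<p>A micro-benchmark in the spirit of the article can be as simple as timing a large matrix multiplication; on Colab the same code exercises whichever runtime (CPU, GPU via a framework, or TPU) is attached. The sizes below are illustrative:</p>

```python
import time
import numpy as np

def benchmark_matmul(n=512, repeats=3):
    """Return the average seconds per n-by-n float32 matrix multiply."""
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    start = time.perf_counter()
    for _ in range(repeats):
        a @ b
    return (time.perf_counter() - start) / repeats

elapsed = benchmark_matmul()
```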
<p>The post <a href="https://petaminds.com/quick-benchmark-colab-cpu-gpu-tpu-xla-cpu/">Quick Benchmark Colab CPU GPU TPU (XLA-CPU)</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/quick-benchmark-colab-cpu-gpu-tpu-xla-cpu/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>TF2.0 Warm-up exercises (forked from @chipHuyen Repo)</title>
		<link>https://petaminds.com/tf2-0-warm-up-exercises-forked-from-chiphuyen-repo/</link>
					<comments>https://petaminds.com/tf2-0-warm-up-exercises-forked-from-chiphuyen-repo/#comments</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Fri, 08 Nov 2019 01:54:24 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[exercise]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[tensorflow]]></category>
		<category><![CDATA[tf2]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1681</guid>

					<description><![CDATA[<p>You may have heard of Ms @huyen chip for her notable yet controversial travel books back in the day. I enjoy reading, but I am not really into travel memoirs. Nevertheless, she surprised everyone with her achievements: getting into Stanford, teaching TensorFlow, and becoming a computer/data scientist. Her story is definitely inspiring. For those who [&#8230;]</p>
<p>The post <a href="https://petaminds.com/tf2-0-warm-up-exercises-forked-from-chiphuyen-repo/">TF2.0 Warm-up exercises (forked from @chipHuyen Repo)</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/tf2-0-warm-up-exercises-forked-from-chiphuyen-repo/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
			</item>
		<item>
		<title>Dimension, Dimension, Dimension &#8211; Reshape your data</title>
		<link>https://petaminds.com/dimension-dimension-dimension-reshape-your-data/</link>
					<comments>https://petaminds.com/dimension-dimension-dimension-reshape-your-data/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Mon, 28 Oct 2019 21:45:59 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[dimension]]></category>
		<category><![CDATA[numpy]]></category>
		<category><![CDATA[pandas]]></category>
		<category><![CDATA[reshape]]></category>
		<category><![CDATA[tensorflow]]></category>
		<category><![CDATA[tf]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1553</guid>

					<description><![CDATA[<p>The most basic yet important thing when working with data arrays is their dimensions. This article covers several data shapes and reshaping techniques. Why reshape data Imagine that you are starving and are suddenly given a piece of delicious food. You may try to put it all in your mouth (Fig 1a) and find [&#8230;]</p>
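<p>A few of the reshaping moves the article covers, sketched with numpy (the values and shapes here are illustrative, not the article's data):</p>

```python
import numpy as np

a = np.arange(12)             # 1-D array, shape (12,)
m = a.reshape(3, 4)           # 2-D matrix, shape (3, 4)
col = a.reshape(-1, 1)        # -1 lets numpy infer the size: shape (12, 1)
flat = m.flatten()            # back to 1-D, shape (12,)
batched = a.reshape(2, 2, 3)  # e.g. (batch, rows, cols) for a model input
```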
<p>The post <a href="https://petaminds.com/dimension-dimension-dimension-reshape-your-data/">Dimension, Dimension, Dimension &#8211; Reshape your data</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/dimension-dimension-dimension-reshape-your-data/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New TensorFlow 2.0 vs 1.X &#8211; Quick note</title>
		<link>https://petaminds.com/new-tensorflow-2-0-vs-1-x-quick-note/</link>
					<comments>https://petaminds.com/new-tensorflow-2-0-vs-1-x-quick-note/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Thu, 10 Oct 2019 08:25:53 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[pytorch]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1048</guid>

					<description><![CDATA[<p>TensorFlow 2.0 is out! Although the TF2.0 API reference lists have already been made publicly available, TF2.0 itself is still at the RC2 version. The final release is expected in the next few days (or weeks). What&#8217;s new in TF2.0: The [&#8230;]</p>
<p>The post <a href="https://petaminds.com/new-tensorflow-2-0-vs-1-x-quick-note/">New TensorFlow 2.0 vs 1.X &#8211; Quick note</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/new-tensorflow-2-0-vs-1-x-quick-note/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Save, restore, visualise Graph with TensorFlow v2.0 &#038; KERAS</title>
		<link>https://petaminds.com/save-restore-visualise-graph-with-tensorflow-v2-0-keras/</link>
					<comments>https://petaminds.com/save-restore-visualise-graph-with-tensorflow-v2-0-keras/#comments</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Tue, 08 Oct 2019 12:36:30 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[front-end]]></category>
		<category><![CDATA[Project]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[MNIST]]></category>
		<category><![CDATA[model]]></category>
		<category><![CDATA[restore]]></category>
		<category><![CDATA[save]]></category>
		<category><![CDATA[tensor]]></category>
		<category><![CDATA[tensorboard]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=1031</guid>

					<description><![CDATA[<p>TensorFlow 2.0 is coming really soon, so we quickly show some useful features, i.e., saving and loading a pre-trained model, with v2 syntax. To make it more intuitive, we will also visualise the graph of the neural network model. Benefits of saving a model Quick answer: to save time, share easily, and deploy fast. A SavedModel [&#8230;]</p>
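<p>The save-and-restore round trip can be sketched in a few lines (the model and filename are placeholders; the article also covers TensorBoard visualisation, omitted here):</p>

```python
import tensorflow as tf

# Build a toy model, save it to disk, and reload it in one step.
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(2)])
model.save("model.keras")  # hypothetical filename; architecture + weights
restored = tf.keras.models.load_model("model.keras")
```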
<p>The post <a href="https://petaminds.com/save-restore-visualise-graph-with-tensorflow-v2-0-keras/">Save, restore, visualise Graph with TensorFlow v2.0 &#038; KERAS</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/save-restore-visualise-graph-with-tensorflow-v2-0-keras/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
			</item>
		<item>
		<title>A.I in agriculture &#8211; Fruit Grading with Keras (part 2)</title>
		<link>https://petaminds.com/a-i-in-agriculture-fruit-grading-with-keras-part-2/</link>
					<comments>https://petaminds.com/a-i-in-agriculture-fruit-grading-with-keras-part-2/#respond</comments>
		
		<dc:creator><![CDATA[Tung Nguyen]]></dc:creator>
		<pubDate>Sun, 06 Oct 2019 07:05:15 +0000</pubDate>
				<category><![CDATA[data science]]></category>
		<category><![CDATA[front-end]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[fruit]]></category>
		<category><![CDATA[fruit grading]]></category>
		<category><![CDATA[keras]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[neural network]]></category>
		<category><![CDATA[tensorflow]]></category>
		<guid isPermaLink="false">https://petaminds.com/?p=994</guid>

					<description><![CDATA[<p>In part 1, we introduced fruit classification with a pure Python implementation. In this part, we will use the Keras library instead. What is Keras Keras is an open-source neural-network library written in Python. It is capable of running on top of TensorFlow, Microsoft Cognitive Toolkit, Theano, or PlaidML. Designed to enable fast experimentation with deep neural networks, it focuses on being user-friendly, modular, and extensible. [&#8230;]</p>
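<p>As a taste of the Keras approach, a classifier skeleton for a grading task might look like this (the feature size, layer widths, and 3-grade output are assumptions, not the article's actual architecture):</p>

```python
import tensorflow as tf

# A compact feed-forward classifier: 16 input features -> 3 quality grades.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),                     # e.g. fruit features
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # e.g. 3 grades
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```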
<p>The post <a href="https://petaminds.com/a-i-in-agriculture-fruit-grading-with-keras-part-2/">A.I in agriculture &#8211; Fruit Grading with Keras (part 2)</a> appeared first on <a href="https://petaminds.com">Petamind</a>.</p>
]]></description>
		
					<wfw:commentRss>https://petaminds.com/a-i-in-agriculture-fruit-grading-with-keras-part-2/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
