+ 13

What is the best library for working with neural networks?

What is the best library for working with deep neural networks? I have worked with TensorFlow and Theano, but their syntax is a bit hard.

2nd Feb 2017, 2:17 PM
abolfazl meyarian
13 Answers
+ 17
There are several popular libraries I know of, but for Python I think you should use Theano. Theano is a very flexible neural network library for use with Python. It is capable of working on CPU and GPU. I've found it to have the best documentation out of the neural net libraries I use. You can find the documentation here: http://deeplearning.net/software/theano/
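For a feel of the workflow, here is a minimal sketch (assuming Theano is installed; the names and values are just illustrative): you build a symbolic expression, then compile it into a callable function that can run on CPU or GPU.

import theano
import theano.tensor as T

x = T.dmatrix('x')                 # symbolic input matrix
y = 1 / (1 + T.exp(-x))            # element-wise sigmoid expression
sigmoid = theano.function([x], y)  # compile the expression graph

print(sigmoid([[0.0, 1.0], [-1.0, -2.0]]))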
4th Feb 2017, 10:17 AM
Baba Yunus Abdul Yekin
+ 7
Try Keras. It works on top of the two you mentioned, but is simpler to work with.
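A minimal sketch of what that looks like (assuming the Keras 2-style Sequential API with a TensorFlow or Theano backend; the layer sizes and toy data are made up):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# toy data: 2 inputs, 1 binary output (XOR)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

model = Sequential()
model.add(Dense(8, input_dim=2, activation='relu'))   # hidden layer
model.add(Dense(1, activation='sigmoid'))             # output layer
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X, y, epochs=500, verbose=0)
print(model.predict(X))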
4th Feb 2017, 5:49 AM
lafat
+ 6
Write it yourself from scratch! Doing so will teach you a lot, and it's not too hard to do.
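As a rough illustration of how little code a basic version needs (the toy data and learning rate below are arbitrary), here is a single sigmoid neuron trained with gradient descent in plain NumPy:

import numpy as np

# toy data: learn an OR-like rule
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [1.]])

w = np.random.randn(2, 1)
b = 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(X.dot(w) + b)))   # sigmoid activation
    grad = p - y                            # gradient of cross-entropy w.r.t. pre-activation
    w -= 0.1 * X.T.dot(grad) / len(X)       # gradient descent step
    b -= 0.1 * grad.mean()

print(p.round(2))                           # predictions approach y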
5th Feb 2017, 11:12 AM
Edwin Martens
+ 6
"...syntax is a bit hard." It happens. If you want to look a little more at TensorFlow, just...don't forget your innovative ideas (the ones you have on your own) when you look where other people have been going: http://playground.tensorflow.org/ - Interactive TensorFlow Visualizer (no coding) - Adjust parameters (view node states, set functions, select features...) - things you might use in your project, for better perspective. https://youtu.be/4n1AHvDvVvw - Keynote / TensorFlow Dev Summit 2017 (yesterday Feb-15; start of a day's videos) - Code samples, release 1, where it's going, etc. For writing your own basic bits as @Edwin Martins suggests, Coursera's Machine Learning course (Andrew Ng) walks you through some "roll-your-own" in Octave (including teaching you Octave syntax): https://www.coursera.org/learn/machine-learning
16th Feb 2017, 10:42 PM
Kirk Schafer
+ 4
There are several popular libraries:
- Theano - Welcome - Theano 0.7 documentation
- Torch - Torch | Scientific computing for LuaJIT
- Caffe - Caffe | Deep Learning Framework
- TensorFlow - https://www.tensorflow.org/
- MXNet - https://mxnet.readthedocs.org/en/latest/
4th Feb 2017, 6:24 PM
Ranjeet Kumar
+ 4
I found this site to be very helpful for writing AI and simulations: http://natureofcode.com/book/acknowledgments/
5th Feb 2017, 11:16 AM
Edwin Martens
+ 3
go for keras ))
4th Feb 2017, 11:33 AM
Rafatoov
+ 3
I am using PyCharm for implementing a feed-forward neural network. It is very easy to implement, as you just have to import the prebuilt packages.
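The post doesn't name the packages, but as one possible example of a prebuilt feed-forward network (an assumption, not necessarily what the poster used), scikit-learn's MLPClassifier does most of the work for you:

from sklearn.neural_network import MLPClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]   # toy inputs
y = [0, 1, 1, 1]                        # toy labels

clf = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
clf.fit(X, y)                           # trains a small feed-forward network
print(clf.predict(X))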
6th Feb 2017, 4:47 PM
decoder
+ 1
Several libraries for neural networks:
- TensorFlow - open source, by Google
- Torch - used with the Lua language
- Theano - used with Python
- Caffe - written in C++ with CUDA
- MXNet
Source: www.quora.com/what-is-the-best-open-source-neural-network
6th Feb 2017, 10:19 PM
Jatmiko Indriyanto
0
I am a beginner; I don't know anything about it.
4th Feb 2017, 2:38 PM
banini
0
thanks my friends
4th Feb 2017, 4:24 PM
abolfazl meyarian
0
Use Keras or write your own. Here is one for you:

import numpy as np

# X = (hours sleeping, hours studying), y = score on test
X = np.array(([2, 9], [1, 5], [3, 6]), dtype=float)
y = np.array(([92], [86], [89]), dtype=float)

# scale units
X = X / np.amax(X, axis=0)  # maximum of X array
y = y / 100                 # max test score is 100

class Neural_Network(object):
    def __init__(self):
        # parameters
        self.inputSize = 2
        self.outputSize = 1
        self.hiddenSize = 3
        # weights
        self.W1 = np.random.randn(self.inputSize, self.hiddenSize)   # (2x3) weight matrix from input to hidden layer
        self.W2 = np.random.randn(self.hiddenSize, self.outputSize)  # (3x1) weight matrix from hidden to output layer

    def forward(self, X):
        # forward propagation through our network
        self.z = np.dot(X, self.W1)     # dot product of X (input) and first set of 2x3 weights
        self.z2 = self.sigmoid(self.z)  # activation function
        self.z3 = np.dot(self.z2, self.W2)  # dot product of hidden layer (z2) and second set of 3x1 weights
        o = self.sigmoid(self.z3)       # final activation function
        return o

    def sigmoid(self, s):
        # activation function
        return 1 / (1 + np.exp(-s))

NN = Neural_Network()
# defining our output
o = NN.forward(X)
print("Predicted Output: \n" + str(o))
print("Actual Output: \n" + str(y))
5th May 2018, 5:20 PM
ravi shankar
- 2
please teach me how to print a variable out
4th Feb 2017, 2:37 PM
banini