Looking back at the year 2017, I see a big growth in my technical skills as I experimented with many of the tools, languages and frameworks available in the market. In today's blog I'll try to summarize most of them and give tips on how they could help you enhance your profile and achieve your professional goals.
Here I go!
Django

Django is a high-level Python framework that lets developers create dynamic websites easily. Its best feature is that it comes with built-in admin support. As developers, we know how tedious it is to set up an admin interface if we go out and implement it ourselves, but Django gives us this built-in feature, ready to use once you set a personal ID and password. This module helps in managing pages, user information and custom databases.
Django works on the very famous MVC (Model-View-Controller) pattern; Django's own flavor of it is sometimes called MTV, for Model-Template-View. Basically it's a set of folders in a directory in which all our code files are organized by functionality. It also has a template syntax for adding custom code into HTML, which helps integrate a plain HTML+CSS3 page with the Django project.
It was one of the best frameworks I came across in recent times. Though it was really difficult to start with, due to its complex structure and code, a familiar Python coder would find it easy after a few weeks of hands-on work. A basic understanding of how websites work, and of URL mapping, makes learning this framework easier.
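To make the URL-mapping idea concrete, here is a toy router in plain Python. This is only an illustration of the concept, not Django's actual API — real Django projects declare routes in `urls.py` with its own `path()`/`url()` helpers:

```python
# A toy URL router: map each path to the "view" function that handles it,
# the same idea Django's urls.py implements for real.
def home(request):
    return "Welcome home!"

def about(request):
    return "About this site"

# In Django this table lives in urls.py; here it's just a dict.
urlpatterns = {
    "/": home,
    "/about/": about,
}

def dispatch(path, request=None):
    """Look up the view for a path and call it, or report a 404."""
    view = urlpatterns.get(path)
    if view is None:
        return "404 Not Found"
    return view(request)

print(dispatch("/about/"))
```

The point is simply that a URL is data used to look up code; once that clicks, Django's routing layer stops feeling magical.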
TensorFlow

TensorFlow is a famous statistical-learning library developed by the Google Brain team. Though it has many built-in machine-learning algorithms, it is mostly used for implementing deep-learning algorithms. Google developers claim that it's the most popular predictive-analytics library of the current times, based on GitHub forks. Since its release, it has seen big growth in the number of developers and users.
I am and always will be a scikit-learn fan, so I'll admit I was a bit anxious about diving into this library. But now that I have tried it, I can easily point out a few major things about the library and its pros and cons.
TensorFlow is not for extreme beginners. It's for someone who is transitioning from theory to implementation. You need to be aware of all the steps involved in implementing an algorithm before actually trying to use it for your model. Though Google says the library is developer-friendly, I don't agree. You definitely need to put in many extra hours to understand its attributes and parameters, but it's worth it!
If you have no idea how a machine-learning algorithm works, I suggest you first use some other library and read up on how the algorithms work. Only once you know what an algorithm does can you actually implement it in TensorFlow; otherwise you'll just get tangled in the code.
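As an example of the kind of understanding I mean: before letting TensorFlow run gradient descent for you, it helps to have fit something by hand at least once. Here is a plain-Python sketch (no libraries, toy data) of fitting y ≈ w·x by minimizing squared error — the step TensorFlow automates at scale:

```python
# Fit y = w * x to tiny data by gradient descent, with no libraries at all.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # generated with the true weight w = 2

w = 0.0            # initial guess
lr = 0.01          # learning rate
for _ in range(1000):
    # Gradient of the mean squared error (1/n) * sum((w*x - y)^2) w.r.t. w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # converges to 2.0
```

Once you can follow every line here, TensorFlow's graphs, losses and optimizers become named versions of steps you already understand.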
Keras

Keras is also a Python-based deep-learning library. It's really good, and TensorFlow even ships with a Keras interface. I have used this library for multiple models, such as convolutional neural networks, simple neural networks and recommendation systems.
I can assure you that it's the best library for someone who has no background in the field. It does all the work in the back-end and leaves the developer free to just call functions. A neural network, a concept covered under deep learning, is easily implemented using it.
But getting addicted to it can be dangerous, as developers keep using it without understanding the model and parameters at all. As long as you don't wish to become a data scientist or a deep-learning expert, it's safe to use; otherwise, I recommend actually learning how deep-learning models work.
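To see roughly what Keras hides behind those function calls, here is a single neuron's forward pass written by hand in plain Python. A Keras dense layer does this same weighted-sum-plus-activation arithmetic, just for whole matrices of weights at once (the numbers below are arbitrary illustration values):

```python
import math

def sigmoid(x):
    # Squash any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, weights, bias):
    # One neuron: weighted sum of the inputs plus a bias, then sigmoid.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

output = forward([1.0, 0.5], [0.4, -0.2], 0.1)
print(round(output, 4))  # → 0.5987
```

Training is then just adjusting `weights` and `bias` to push `output` toward the right answer — which is exactly the part Keras automates.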
Jupyter Notebook

IPython ships with many Python distributions, but that's nothing compared to Jupyter Notebooks. A notebook consists of cells where the developer can write code and run it right away.
I had personally never used such a tool as a developer. If I wanted to run code right away, I knew I had to use the command prompt. But when I used Jupyter for the first time, it really filled me with enthusiasm: suddenly I didn't need to open cmd, set paths and run things; I could just hit a key and the code would run.
For all my ML/DL projects, I prefer to work in these notebooks, as I can run each and every cell and know exactly where the error lies in my code. In fact, even if you already have the code, you can run it step by step, which helps you analyse what each part of the code does. A must for all Python developers.
MCS & OCR API
These stand for the Microsoft Cognitive Services API and the Optical Character Recognition API, respectively. They're under the same heading because I used both of them in my 7th-semester bachelor's project on image recognition. Implementing image recognition from scratch can be quite intimidating, since it involves collecting millions or billions of training images and then the whole train-and-test process. For a school-level learner, those steps are next to impossible.
For such learners, the big companies have already done the whole process of image collection, processing and tagging; we just have to use their APIs and adapt them to our project. These two APIs not only helped me in my project, they also made me realize the immense amount of work, effort and money that companies like Google and Microsoft have invested in these fields. It surely suggests the field has a good future.
The OCR and MCS APIs gave me 90% and 75% accuracy, respectively. OCR recognizes written lines/words/paragraphs, while the MCS API specializes in recognizing emotions and objects.
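For a feel of how simple the API side is compared to training your own model, here is a sketch of assembling an OCR request the way the Cognitive Services REST endpoint expected it around 2017. The region, API version, key and image URL below are placeholders/assumptions — check the current Azure documentation before relying on them:

```python
# Sketch of a Microsoft Cognitive Services OCR request (v1.0-era layout).
# Nothing is sent over the network here; we only assemble the pieces.

def build_ocr_request(image_url, subscription_key, region="westus"):
    """Assemble endpoint, headers, params and body for an OCR call."""
    endpoint = f"https://{region}.api.cognitive.microsoft.com/vision/v1.0/ocr"
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,  # your MCS key
        "Content-Type": "application/json",
    }
    params = {"language": "unk", "detectOrientation": "true"}  # auto-detect
    body = {"url": image_url}  # image to read text from
    return endpoint, headers, params, body

endpoint, headers, params, body = build_ocr_request(
    "https://example.com/receipt.jpg", "YOUR_KEY_HERE")
print(endpoint)
```

An actual call would then be something like `requests.post(endpoint, headers=headers, params=params, json=body)`, with the recognized text coming back as JSON.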
Google Analytics

I came across this tool just recently, when I joined a Google Analytics firm for my winter internship. Since data collection is a big part of any predictive model, tools like this actually help us get the data. Prediction can then be done once we have the data.
Besides that, GA provides many more features, as it helps companies understand how users interact with their sites. It's already used by most major firms and is a tool that any business analyst SHOULD be aware of.
Tableau

This is the BEST software I came across in 2017. While there are many Python libraries through which you can visualize data, this tool is just so much simpler. You can upload the data and see the visualization with a few clicks. That's like magic.
An experienced Python developer would know how satisfying that is: for the same visual content, they would have to write multiple lines of code, and it still wouldn't give the exact look and feel they want. Tableau just makes it so much easier. Upload and watch.
While it extracts the custom dimensions itself at first, you can also set them according to your own requirements. If you're looking for a career in visualization, do use this software. I simply love it, and I'll soon write a dedicated article on it.
Microsoft Excel

It comes with the Microsoft Office package, so this software isn't new to anyone; we have all used it since school. But recently Microsoft has added a lot of data-processing features to it. I had never really used it before this year, since it was never really needed for my work.
After working with so many CSV and XLS files for machine learning, I have finally realized the importance of this tool. Besides, all the employees in the company I currently work for use Excel in one way or another; most of them work with Excel data, applying filters and custom operations.
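The same filter-and-aggregate work Excel is used for maps directly onto pandas when the data arrives as CSV/XLS. A small sketch (the columns and numbers are made up for illustration):

```python
import pandas as pd

# Hypothetical sales data, standing in for an Excel/CSV export.
df = pd.DataFrame({
    "region": ["North", "South", "North", "East"],
    "units":  [10, 4, 7, 12],
    "price":  [2.5, 3.0, 2.5, 1.75],
})
df["revenue"] = df["units"] * df["price"]  # a computed column, like a formula

# The Excel-style "filter": keep only rows for the North region.
north = df[df["region"] == "North"]
print(north["revenue"].sum())  # 10*2.5 + 7*2.5 = 42.5
```

For real files you would start from `pd.read_csv("sales.csv")` or `pd.read_excel("sales.xls")` instead of building the DataFrame by hand.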