Rental Demand Forecast Analysis using Python
Supply and demand forecasting is a hot topic in the machine learning community. The importance of operations management is widely understood for goods, commodities, and even the service industry. Supply chain disruptions such as machinery breakdowns, quality concerns, inaccurate inventory records, poor forecasting, capacity constraints, or labor shortages cause millions in losses. Machine learning algorithms are being…
Plotting US Hurricanes Path using Python
A hurricane is a type of storm called a tropical cyclone, which forms over tropical or subtropical waters. The Central Florida Hurricane Center ( https://flhurricane.com/ ) stores data on all hurricanes in the US. The data contains Date & Time, Latitude, Longitude, Wind, Pressure, and Movement. The data can be downloaded in CSV format, to be…
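For a rough idea of what plotting such tracks could look like, here is a minimal pandas/matplotlib sketch; the file name hurricanes.csv and the Name, Latitude, and Longitude column labels are assumptions for illustration, not necessarily the approach used in the post:

    # Sketch only: file name and column names are assumptions, not from the post.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("hurricanes.csv")  # hypothetical local copy of the CSV download

    fig, ax = plt.subplots(figsize=(8, 6))
    for name, track in df.groupby("Name"):  # draw one line per storm
        ax.plot(track["Longitude"], track["Latitude"], marker=".", label=name)

    ax.set_xlabel("Longitude")
    ax.set_ylabel("Latitude")
    ax.set_title("US Hurricane Tracks")
    plt.show()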
Native REST API – Azure Data Flow Component
Azure Data Factory and Azure Synapse Analytics now support a REST API data flow component with both source and sink capability, currently for JSON payloads only. As part of the code-free design, REST endpoints can be incorporated directly into data flows to transform and process data in pipelines. To demonstrate the new feature, we will simply add…
SSMS Top N Rows
Whenever we work with Azure Synapse through SSMS, we often want to quickly preview the top N rows of a table. A quick way of achieving this is to right-click the table / view and select "Select Top 1000 Rows", but sometimes we don't need a thousand rows; 100, or even 10, is sufficient. To change the default 1000-row count,…
KNIME Data Science Orchestration with Neo4j
“Sell me this pen” – an urgent business request. Data scientists spend a large amount of time wrangling data from different sources and running ML algorithms. With KNIME’s Neo4j Connection node, data science teams can build an end-to-end data science orchestration pipeline. YouTube video from the Neo4j 2021 Developer Forum: https://www.youtube.com/watch?v=NDG9lYbxP2U
KNIME <-> MongoDB Nodes – Updates
The KNIME 4.4.0 release on June 29th, 2021 brought a new approach for MongoDB integrators. In my previous blog post, I mentioned an exception thrown when writing data containing “#” to MongoDB documents. This is now fixed in KNIME 4.4.0. Also, all the MongoDB nodes have been upgraded. The new MongoDB Connector nodes give the flexibility…
KNIME <-> MongoDB (Save) – Part 5/5
We are at the last part of the KNIME <-> MongoDB series, where we look at the MongoDB Save component. MongoDB Save saves modified documents read from the upstream pipeline, matching on the _id field. If the _id is already present, the document is updated with the new fields; otherwise,…
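Outside of KNIME, the same save-or-update behaviour can be sketched with pymongo's replace_one and upsert=True; the connection string, database, collection, and document below are placeholders for illustration only, not the component's internals:

    # Sketch of upsert-by-_id: replace the matching document if it exists, insert it otherwise.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # placeholder connection string
    collection = client["demo_db"]["demo_collection"]  # placeholder database/collection names

    doc = {"_id": 1, "name": "sample", "status": "modified"}
    collection.replace_one({"_id": doc["_id"]}, doc, upsert=True)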
KNIME <-> MongoDB (Update) – Part 4/5
If you are reading this 4th part of the series, I am sure you have already read CSV files, imported them, and even deleted some documents from your MongoDB through KNIME. If not, visit my MongoDB Reader, MongoDB Writer, and MongoDB Remove blog posts. In today's 4th part, we will discuss how to update…
KNIME <-> MongoDB (Remove) – Part 3/5
Welcome back to my KNIME <-> MongoDB Components 5-part series. Today, in this 3rd part, we will discuss how to use the MongoDB Remove component from KNIME. The MongoDB Remove component is pretty straightforward: it accepts data in JSON format and executes remove operation(s) on the collection/database. In our demo, we have to…
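For comparison outside of KNIME, an equivalent remove operation can be sketched with pymongo's delete_many, where the filter is an ordinary JSON-style document; the connection string, names, and filter below are placeholders for illustration:

    # Sketch: delete all documents matching a JSON-style filter.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # placeholder connection string
    collection = client["demo_db"]["demo_collection"]  # placeholder database/collection names

    result = collection.delete_many({"status": "obsolete"})  # placeholder filter
    print(result.deleted_count, "documents removed")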