
Posts

So, WTF is Artificial Intelligence Anyway?

Image By Seanbatty (Pixabay) According to Encyclopedia Britannica , artificial intelligence (AI) can be defined as: "The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. The term is frequently applied to the project of developing systems endowed with the intellectual processes characteristic of humans, like the ability to reason, discover meaning, generalize, or learn from previous experiences." By now, we have all heard about how AI can make it possible for computers, machines and other electronic devices to perform increasingly complex and human-like tasks. While all this sounds almost like magic, with machines performing increasingly complex tasks (from new gaming computers to self-driving cars), in reality most AI technologies rely on a blend of software methods and technologies for collecting, processing and recognizing patterns within large amounts of data. So, how does AI W
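
To make the "recognizing patterns within large amounts of data" part a bit more concrete, here is a minimal, hypothetical sketch (not from any specific product mentioned here) of a program that learns to recognize handwritten digits from labeled examples. It assumes the scikit-learn library; the dataset and model are illustrative choices only.

# Hypothetical sketch: "learning from previous experiences" as fitting a model
# to labeled examples, then recognizing the same patterns in new data.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()  # small built-in set of 8x8 handwritten digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

model = LogisticRegression(max_iter=2000)  # a simple pattern-recognition model
model.fit(X_train, y_train)                # learn patterns from past examples
print("Accuracy on unseen digits:", model.score(X_test, y_test))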

The BBBT Sessions: WhereScape

Originally founded in 1997 in Auckland, NZ as a data warehouse consulting company, WhereScape has evolved into a solution provider and, especially in the last five years, a key player in the data management market, particularly in the data warehousing and big data spaces. During a great session with the BBBT, WhereScape showed its “latest and greatest” news and innovations, triggering meaningful discussions and interactions with the analysts of the BBBT. Here is a summary and commentary of that cool session. WhereScape at a glance As mentioned before, through an evolution that spans more than 20 years, WhereScape has become a provider of data infrastructure and automation solutions. It currently offers three main solutions: WhereScape 3D . A solution to assist in planning, modeling and designing data infrastructure projects, as well as enabling rapid prototyping. WhereScape RED . A solution to enable fast-track development, deployment and operation

Informatica Partners with Google Cloud to Provide AI-Driven Integration

As cloud computing adoption continues to grow, so does the need for modern and more efficient business and data integration capabilities. And while many aspects of business and data integration are being simplified and automated, the increasing sophistication of business needs and the requirement for highly efficient, continuous integration are forcing organizations to make continuous calls for new and ongoing digital transformation efforts. In this vein, interesting news came in just a couple of weeks ago when a partnership between Informatica , a big player in the integration platform as a service (iPaaS) market, and tech giant Google was announced . Of course, the mere fact that two major players in the software industry decide to partner is already something worth listening to, but the partnership is also particularly interesting because it involves the provision of artificial intelligence (AI)-driven integration services, in an enormous effort from both

WTF is Deep Learning Anyway

Following on my previous WTF post on Machine Learning, it just makes sense to continue this line of thought and address another of the many popular and trendy concepts out there. We are talking about: Deep Learning. So without further ado, let's explain WTF deep learning is, shall we? Simply put, and as can be inferred from the previous post, deep learning is one of the now many approaches to machine learning we can find out there, along the lines of other approaches like decision tree learning, association rule learning, or Bayesian networks. While deep learning is not new (it was introduced by Dr. Rina Dechter in 1986), it is only in recent years that this approach has gained fame and popularity among users, and particularly among software companies adopting it within their analytics arsenals. Deep learning makes it possible to train a computer to perform tasks such as recognizing speech, identifying images or making predictions by, instead of organizing data to run through predefined equations, sets
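
As a rough illustration of the "layers instead of predefined equations" idea, here is a minimal sketch of a small neural network. It assumes TensorFlow/Keras is installed and uses the MNIST digits dataset purely as an example; none of these choices come from the post itself.

# Minimal sketch: stacked layers learn their own features from examples,
# rather than running data through hand-crafted equations.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0    # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # raw pixels in
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer learns features
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1)      # train on labeled examples
model.evaluate(x_test, y_test)             # check how well it generalizes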

Oracle’s New Cloud Services: A New Big Push for Automation

With a recent announcement, Oracle, the global software and hardware powerhouse, follows up on its continuing effort to equip all of the solutions in its Cloud Platform with autonomous capabilities. As part of a venture that started early this year with the announcement of the first set of autonomous services (including Oracle Autonomous Data Warehouse Cloud Service ) and the announcement of Oracle 18c as Oracle’s first fully autonomous database, the company is now extending these capabilities with the launch of another set of services in the cloud. This time it is the turn of three new services: Oracle Autonomous Analytics Cloud , Oracle Autonomous Integration Cloud , and Oracle Autonomous Visual Builder Cloud which, according to Oracle, will be followed by the release of more autonomous services later in the year, focused on mobile, chatbots, data integration, blockchain, security and management, as well as more traditional database workloads including O

Hadoop Platforms: The Elephants in the Room

"When there’s an elephant in the room introduce him" -Randy Paush It is common that when speaking about Big Data two major assumptions often take place: One : Hadoop comes to our minds right by its side, and many times are even considered synonyms, which they are not. While Big Data is the boilerplate concept that refers to the process of handling enormous amounts of data coming in different forms  (structured and unstructured), independent of the use use of a particular technology or tool, Hadoop is in fact, a specific open source technology for dealing with these sort of voluminous data sets. But before we continue, and as a mind refresher, let’s remind ourselves what is Hadoop with their own definition: The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering