Many of our customers across Canada – in Calgary, Edmonton, and Saskatoon – have been switching over to the budget-friendly Microsoft Power BI toolset for reporting and data analytics.
There is a lot to love about this easy-to-use tool, but when I think of Power BI functionality I imagine Microsoft asking the question: “What are the key features (the 20%) of all the best reporting tools that 80% of users use all of the time?” Then they went to work delivering only that 20% and very little more. I think of this when I see users trying to print a report or export to PDF. You cannot do a full printout of a report – only a screen capture of the first page. Maybe the intention was to get more users logging into the reports online, or maybe the idea was to save trees and reduce printing. Or is this another example of Microsoft reducing development effort and choosing to implement only the bare necessities?
Another example of this is the ability to add a subtotal within a chart – but then not being able to customize the name of that subtotal to include the field that is actually being subtotaled. So you end up with something like this:
This works, you can see the total, but if you have a long list of cities you have to scroll back up to see that you’re looking at Alberta and not British Columbia or Saskatchewan. There is no way to rename the total to ‘Alberta Total’, for example. We can’t complain too loudly, though.
In the next post I’ll focus more on what Power BI can do and how to really get some of the best use out of this great dashboarding tool. For now though I will go on believing that the Microsoft management team chose to implement features thinking of the 80/20 rule and what 80% of the reporting features are used rather than developing the perfect tool. In doing so they have successfully delivered a very popular tool with many features and just a few limitations.
How many ways have you been thinking of to improve your company? The company suggestion box might be empty but many employees easily recognize ways to reduce costs and save their company money. Sometimes the most difficult part in accomplishing the changes needed to reduce costs is proving that it will make a difference. We’ve been helping customers in Calgary, Edmonton, Saskatoon, and Victoria reduce their costs by managing by numbers and making decisions based on facts. Here’s how we’ve helped others using Business Intelligence and Data Analytics.
Shifting a company over to fact based decisions starts with a reliable system to see those facts. Having that reliable single source of the truth is key for being able to unify an organization and bring teams together based on their data. Once that system is in place then it is possible to make decisions and reduce costs based on evidence provided by data.
By setting up systems to easily visualize company information, it becomes an easy task to identify costs by department and to see where money is being spent as an investment that helps increase company revenue versus money that is not getting a positive return on investment. These reports can be scheduled to go out to the responsible managers so they can better manage the numbers and take action. If you need a special report to add to a presentation and get it in front of your management board, that is possible with little effort using business intelligence. We’ve had business customers create new reports while their manager was still standing in the doorway of their office waiting for an answer. It is possible to get this kind of information quickly with self-service reporting, without having to make a request and wait a week for a response. This kind of self-service reporting is commonplace now; it is possible to reduce the load on the IT department while still living within good data governance practices.
If you know the information is available but you are having trouble pulling it together to prove that with a little effort dollars can be saved, we can help. We’ve helped other companies pull together reports and even distribute these directly to managers on a regular basis so better decisions can be made. Getting close to the budget? Want to know what was spent last year compared to this year to date? Need to understand the project costs better in order to support the next project? We can set things up and automate the whole process so each new query can be answered with self-service proficiency. It can be as easy as opening up an Excel spreadsheet – or in fact much easier.
There really is an explosion of data, and often not enough time or resources to make the most of it. By automating processes around managing the data and using new tools with Artificial Intelligence built in, it is possible to gain a real competitive advantage without being overwhelmed. It used to be that we dreamt of a day when systems were easy to set up and provided answers quickly – rapid responses without spending two years building a data warehouse to pull it all together, and then still needing an IT degree to get the answer to the latest query. The future is here now: you can log in, upload a file, get answers immediately, and start making fact-based decisions for your organization today. Ask us how to help you get started with a free trial of IBM Watson Analytics and get the answers you need. Finding the proof of the changes can be that easy. Contact us at email@example.com or (587) 885-1090.
Business Analytics Moving From On Premise to Cloud
Business Analytics is just one of the software applications used by companies that has been moving to the cloud over the last several years and this trend is gaining speed. We have customers in Calgary, Edmonton, and Saskatchewan that have already moved to the cloud or are in the process of moving to the cloud. Why are they doing this? How are they doing this? I’ll explain a few of the reasons here.
Cost Savings in the Cloud
There are several reasons for the move to the cloud, but the driving force tends to be cost savings. When you consider the cost of servers plus the cost of full-time employees to maintain those servers, the savings become quite clear: these capital and operational expenses shift to a predictable monthly operational service charge that can be relied on as a consistent cost going forward.
How to go to the Cloud
When companies are making the decision to move to the cloud, they have options. One is to go to the software vendor, such as IBM, and buy the SaaS version of the software they need – for example, Cognos Analytics on the cloud – and pay a monthly fee to use the software without the cost of upgrades and the like. Another option is to simply keep their on-premise software but install it on cloud servers like Amazon’s. This doesn’t present the same software application cost savings, but it does keep more control in the company’s hands.
Why Not Move to the Cloud
On the other hand – why would a company not move from on-premise to the cloud? The main reason companies are not moving is typically security. There are also some industries in which the location of data is highly regulated. For instance, due to privacy regulations, all data regarding children in Canada must remain in Canada. There are cloud servers located in Canada, but this is one hold-up for some organizations in childhood education. In addition, security remains a top concern for most industries.
Help in the Cloud
Need help justifying a move of your business analytics to the cloud – or need business analytics, with the cloud as your first stop? Let us know. We’re happy to help you experience life in the cloud! Contact us at firstname.lastname@example.org or (587) 885-1090.
Whether you know you’re going to need BI or you’re just trying to determine if it’s worth the cost, you need to know what the costs really are. We’ve worked with customers in Calgary, Edmonton, Saskatchewan and British Columbia to come up with budgets for their Business Intelligence projects and find ways to minimize that expenditure. Here are some factors to look at when setting the budget for your analytics project.
Sources of Data
Initial project size
Number of recipients
Internal versus External Resources
Type of Software and Brand
On-Premise or SaaS
Real time Versus Near Real Time Versus Batch
Sources of Data
Sources of Data – since the whole purpose of business intelligence is to analyze data, this is a good starting point. Do you know where your data is stored? Who has access to it? Is it in an easily readable format? What are the volumes of data? Are you looking to compare against external data like weather trends or social media? Knowing where your data is and how many sources there are is key to getting a feel for how long it will take to pull things together and provide access to the data. Multiple sources can make for a bigger budget – but also greater value for end users, who can then get a complete picture of the organization and impacts.
Initial Project Size
We always recommend starting small and building up to bigger projects. This limits the risk, and if things do go wrong the team can quickly learn from the issues and be on track for bigger projects in the future. Projects that can be delivered in 30, 60 or 90 days are ideal for starting out. Once the team has successfully delivered these short-term projects, it may be time to move on to larger deliveries. Proving out the return on investment is key regardless of size, but starting with a short-term delivery is helpful in many ways. Budgeting for a 30-day project is also much easier than determining the budget for a long multi-year project.
Number of Recipients
Paraphrasing an idea from “Think and Grow Rich”: the most success can be found by helping the most people. However, a bigger audience will increase the costs of your project, both through higher software costs and through the time and effort needed to gain consensus on each issue. The benefits of helping more people within your organization have to be weighed against the increases in costs. You can provide BI to a greater audience – but we recommend baby steps to get there, to keep costs to a minimum and to gain momentum.
Internal versus External Resources
A separate post needs to be written to cover all of the pros and cons of internal and external resources for BI projects. A general approach is to get external resources to help get you off the ground quickly and get the initial projects delivered, while training internal resources. This approach helps to set best practices and reduces costs over the long run. Once that project is delivered then maintenance and upgrades need to be considered.
Some companies are too small for a full service IT team for maintenance and ongoing projects – in this case it is helpful to outsource this type of role. When a task like adding a user is only done once every six months or so it’s easy to forget the steps and sometimes easier to call on external resources to complete these small support tasks rather than have a Full Time Employee assigned to do ongoing support when it comes up occasionally. Again, many considerations for resourcing but helpful to take a long term perspective even when looking at small projects as the costs can add up in a variety of ways.
If you have an experienced internal team it can be a great way to reduce costs over bringing in expertise – if not then a balance of internal and external can help keep costs down while delivering an advanced system on time and on budget.
It’s no secret that some software is more expensive than others. When looking at which one to go with, the Gartner Magic Quadrant gives a good idea of the strengths and weaknesses of each product. There are many considerations when evaluating software for a new project: do you already have software installed? Are you an Oracle/IBM/Microsoft etc. shop? Who will be maintaining it? How easy is it to find resources to work with this software? Is the software sold on a concurrent-user basis, named users, PVU (size of your server) or something else?
Software licensing is often different depending on the type of software – it’s good to ask how it’s sold and if there are better options for you depending on the number and type of users you are expecting. There are also Express versions of most software for small deployments or proof of concepts – or Software as a Service that reduces the upfront cost and is often better for companies who don’t want to handle software upgrades but just want to jump right into data analysis.
When looking at software costs it is important to remember there are often renewal costs to deal with (depending on whether the software is on-premise or SaaS). Renewal costs often fall in the range of 20% of the initial purchase, increasing each year. Just one more item to consider when planning your BI project budget.
On Premise or Software as a Service (SaaS)
As mentioned above, Software as a Service is an option for BI projects, and it is a big consideration for project costs: do you want an on-premise solution or SaaS? There are many different considerations here – upfront costs, ongoing maintenance, data security, and policies about the location of company data. It can also affect whether the cost becomes an operational expense or a capital expense.
SaaS can significantly reduce the upfront cost of a project and in turn reduce its risk; however, there are other considerations, like how the expense affects the financial books.
Big data is completely relative to the systems handling the data, and quite simply, bigger volumes mean more work in pulling the information together and higher hardware storage costs – even though those costs keep coming down. Do you have hundreds of SKUs or millions? Tens of customers or thousands? Sensor data that arrives by the millisecond, once a day, or once a month? Frequent updates tend to create larger data volumes and more work in managing them, which in turn affects your budget. Having an idea of your data volumes at the outset of your investigation will help you decide on software and gauge the impact on your budget.
Budget and timeframes go hand-in-hand and often can be balanced against one another. If you need this information yesterday the costs will be high in getting things together quickly and pulling a team together to complete all the work upfront. Often this is a matter of spreading out the costs or facing them upfront. However, if faster means you have to rely more on external resources and high-value expertise this could add additional costs to your project, particularly in comparison to slowly building an internal team to complete the tasks at hand.
Real time Versus Near Real Time Versus Batch
Real-time data – the ability to see how things are changing as they happen – is critical for alerts on equipment, servers, etc., allowing early warning of changes in your business. At the same time, it’s not always helpful to have real-time data; when a report is constantly changing, every conversation depends on who looked at the latest version. This can result in individuals doubting the reports, or in unnecessary conversations about results simply because they changed since they were last looked at. Month-over-month and year-over-year reporting rarely needs real-time data. Knowing how old your data can be is critical in deciding the type of hardware and software needed for the project, and is a key item to understand when planning your BI budget.
Understanding all of the above considerations will help mitigate cost overruns. Despite this, when starting the journey of understanding your business data things do come up. This may be as simple as extra discussions surrounding business rules that were believed to be well understood but the data shows otherwise, or unforeseen issues like network problems when moving large volumes of data. Other considerations include backup frequency and network/hardware infrastructure.
It often takes many months to get a full BI project started with a budget established and approved, a timeframe set, and an expected return on investment. Despite the length of time it takes these projects, it can be very important for companies to jump into business intelligence and key to gaining a competitive advantage in many industries. As well understanding these budget impacts will help to mitigate the risks to the project in general and the budget specifically.
Ask us for more information and how we can help you make the most out of your BI project. Send us an email at info@BowRiverSolutions.com or call (587) 885-1090.
Good Cop Bad Cop – Getting Your Data To Spill The Beans
We’ve been working with clients throughout Western Canada – including Calgary, Saskatoon, Edmonton, Victoria and other cities – to help them make the most of their Business Intelligence software purchases and get real use out of the mountains of data we are surrounded by these days. There are several approaches to getting your data to talk, beyond a good cop/bad cop routine.
Exploring your data through Business Intelligence helps to eliminate missed opportunities and gain a competitive advantage. There are many elements to Business Intelligence that help in this regard. Here are just a few examples:
Understanding how project costs are accumulating will help to keep track of projects as they progress.
In addition a thorough understanding of project costs will ensure current and future project success.
Understanding social media and what your customers are saying about your business and industry can give you insight into how to best spend your time.
Effective BI systems help you understand all areas of a business: finance, with year-over-year reporting and easy slice-and-dice by business area and manager; operations, for ahead-of-the-game quality control; sales, for on-the-road reporting and customer understanding; and marketing, for campaign analysis – and so much more.
With Business Intelligence and data warehousing we are able to join isolated silos of data to provide a full view of the organization.
We develop solutions that answer business questions and put the control into your hands where it is most needed, so you can do your job quickly and effectively. With self-serve access to your business information you don’t have to put on the bad cop attitude – you can simply answer your own questions and start taking action. This isn’t a pipe dream; ask us how we’ve helped other companies get data into the hands of decision makers so they can make fact-based decisions. Contact us today at email@example.com or (587) 885-1090.
Cognos Access Manager to Active Directory
Several long-time Cognos customers in Western Canada are still using Access Manager to manage security for their IBM Cognos BI environment. (You’re not alone!) It was a great tool with some handy advantages for administrators; however, it is outdated, and there are many reasons to move over to Active Directory for managing security going forward, including compatibility with other server upgrades.
We’ve been able to successfully help our customers in Calgary and Victoria make the transition to Active Directory smoothly and easily. Here are the steps we followed and some rough timeframes in case this is on your to-do list in the near future.
Steps to Migrate
Purchase and install Avnet BSP Lifecycle SMSS software (we can help)
Recreate Access Manager security structure in Active Directory
Validate accounts and mapping between AD and Access Manager
Migrate each environment (typically Dev, Test, and Production)
Basic functionality and user access need to be tested. However, the most important part of testing is to ensure all My Folders objects are available under the new security model.
Who should test depends on the environment, but it is recommended that each user logs in and checks that their My Folders objects exist in the migrated environment.
Keys to Success
Minimize changes to accounts
Freeze changes to Access Manager so account validation only occurs once
Key user involvement for testing and sign-off
Several factors affect the timeframe of this type of project, including how many other changes are being made at the same time. If this is the only project and the basic security structure will stay the same, moving one environment with an average number of objects takes 2-5 days; increasing the number of objects increases the overall duration. Testing can also extend the project, as it may take several days or even weeks before all users have approved the change and confirmed their My Folders objects exist with nothing missing.
It can be so much easier than you think to move over to Active Directory and shut down the old server supporting Access Manager. Ask us how we can help. We can even do this remotely – we can help you wherever you are, not just in Calgary, Alberta, Canada. (That means the US too, and with the Canadian dollar exchange rate we can make this very easy price-wise as well.)
We would be misleading you to suggest that this method is the only option for moving over to Active Directory – there are other software options and there is a manual approach, however this is the one that we recommend and can back up with happy customers.
Manually moving accounts over introduces greater risk of missing objects, missing scheduled tasks, and so on. In addition, the time it takes quickly adds up to the equivalent cost of the software purchase – unless there are so few users and My Folders objects that the work (or any loss) would be insignificant anyway.
Other software options are available and should be considered however we like the Avnet BSP Lifecycle SMSS tool for the simplicity of moving all account objects directly over to the new Active Directory accounts. Other options provide an extra layer of mapping between the old and new accounts – this works well but again, we like the simplicity of running this once and being completely transitioned over to Active Directory.
Are you ready to move your Cognos security over to Active Directory? Give us a call and we can set up a meeting to review your environment and come up with some guidelines for making this move. We can meet locally in our Calgary downtown office, in your office or via a conference call.
Artificial Intelligence sounds like a futuristic world where robots are going to take over. The reality is that the future is now, and we can make use of AI in our everyday lives and in our work in particular. AI is a broad concept covering machines that can complete processes we would consider smart. Machine Learning takes this a step further: machines access data and learn for themselves rather than being programmed to understand or complete tasks.
Using IBM’s Watson Analytics it is possible to ask a natural language question and get answers without having to program the system to come up with that specific answer. The system then learns what is more appropriate and helpful as answers are selected and the picture of the information is refined. Future questions are answered faster and even more appropriately. This really is the smarter future, where information is reported on as seamlessly as new movies are recommended within Netflix. In fact, we are already so entrenched in the future of Artificial Intelligence that it seems commonplace. We even think it’s a pain to shop on a site that doesn’t recommend a better alternative when we’re trying to find a specific item. We expect the website to use AI and ML to suggest what other customers have bought or similar items we might be interested in.
Over the past 5-10 years it has become a common question: why can’t I get ‘my report’, ‘my information’, ‘the answer’ (you fill in the blank) not only as quickly as Google, but with a similar understanding and reasoning about the question? Now, with systems like Watson Analytics, it is possible to ask a question in your own words and get an answer with reasoning and logic built into it – a truly smart answer.
While many machine learning algorithms have been around for a long time, the ability to automatically apply complex mathematical calculations to big data – over and over, faster and faster – is a recent development. Here are a few widely publicized examples of machine learning applications you might be familiar with:
The heavily hyped, self-driving Google car? The essence of machine learning.
Online recommendation offers, such as those from Amazon and Netflix? Machine learning applications for everyday life.
Knowing what customers are saying about you on Twitter? Machine learning combined with a real understanding of words and their meanings.
Why is machine learning important?
Resurging interest in machine learning is due to the same factors that have made data mining and Bayesian analysis more popular than ever: growing volumes and varieties of available data, computational processing that is cheaper and more powerful, and affordable data storage.
All of this means it’s possible to quickly and automatically produce models that can analyze bigger, more complex data and deliver faster, more accurate results – even on a very large scale. And by building precise models, an organization has a better chance of identifying profitable opportunities – or avoiding unknown risks.
Oil and gas
Finding new well locations. Analyzing minerals in the ground. Predicting refinery sensor failure. Streamlining oil distribution to make it more efficient and cost-effective. The number of machine learning use cases for this industry is vast – and still expanding.
Transportation
Analyzing data to identify patterns and trends is key to the transportation industry, which relies on making routes more efficient and predicting potential problems to increase profitability. The data analysis and modelling aspects of machine learning are essential tools for delivery companies, public transportation and other transportation organizations.
Health care
Machine learning is a fast-growing trend in the health care industry, thanks to the advent of wearable devices and sensors that can use data to assess a patient’s health in real time. The technology can also help medical experts analyze data to identify trends or red flags that may lead to improved diagnoses and treatment.
Marketing and sales
Websites recommending items you might like based on previous purchases are using machine learning to analyze your buying history – and promote other items you’d be interested in. This ability to capture data, analyze it and use it to personalize a shopping experience (or implement a marketing campaign) is the future of retail.
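The basic intuition behind those “customers who bought this also bought…” suggestions can be sketched with simple purchase co-occurrence counts. The baskets and item names below are invented for illustration; real retail systems use far more sophisticated collaborative filtering over millions of orders.

```python
# Hypothetical sketch: recommend items that most often appear in the
# same basket as a given item, using raw co-occurrence counts.
from collections import Counter
from itertools import combinations

# Made-up order history: each set is one customer's basket
orders = [
    {"skates", "helmet"},
    {"skates", "helmet", "stick"},
    {"stick", "puck"},
    {"skates", "stick"},
]

# Count how often each ordered pair of items was bought together
pair_counts = Counter()
for basket in orders:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1
        pair_counts[(b, a)] += 1

def recommend(item, n=2):
    """Return up to n items most frequently bought alongside `item`."""
    scored = [(count, other)
              for (x, other), count in pair_counts.items() if x == item]
    return [other for count, other in sorted(scored, reverse=True)[:n]]

print(recommend("skates"))
```

Counting pairs in both directions keeps the lookup a simple filter; for large catalogues you would index by item instead of scanning all pairs.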
Financial services
Banks and other businesses in the financial industry use machine learning technology for two key purposes: to identify important insights in data, and to prevent fraud. The insights can identify investment opportunities or help investors know when to trade. Data mining can also identify clients with high-risk profiles, or use cyber surveillance to pinpoint warning signs of fraud.
For customers of ours in Calgary, Edmonton, Saskatchewan and British Columbia, we’ve been using advanced analytics to boost profit, reduce costs, and save time. Send us an email at support@BowRiverSolutions.com or call (587) 885-1090 to request your FREE Watson Analytics access and, like thousands of others, give Watson Analytics a test drive!
This quote is an interesting perspective on the importance of data in our digital society. This isn’t a particularly new quote but is very relevant for businesses in Calgary, Edmonton and other areas impacted by the energy industry. Our interaction with the digital realm has become an ingrained part of our daily lives, affecting how we interact socially, how we shop and access services, and how we conduct business. Data plays a critical infrastructure role for businesses both internally (being able to accurately forecast and track sales, manage and analyze inventory), but also externally (managing customer relations to improve sales, building a brand’s identity, delivering targeted advertising). To remain competitive and innovative, companies must stay up-to-date on managing the flow of data: how it is captured, analyzed, and leveraged to inform solid decision making. Forward thinking organizations are interested in making the most of this data resource as a means of reducing costs and operating efficiently in dynamic markets. The need for data analysts to interpret and turn data into actionable knowledge will continue to grow as organizations struggle to keep up with the volume and complexity of big data.
Business analytics / business intelligence: propelling business forward
Business Intelligence has several functions and can be used in the corporate environment to detect fraud, reduce wasted resources, increase revenue and save time. Perhaps most importantly, it empowers decision making at all levels of management, moving away from “gut” or instinct-based decision making that can lead to sub-optimal performance. Key functions of business intelligence include advanced reporting and analysis capabilities, and the ability to compare data to improve tactical and strategic management. Data that is properly managed can be an incredibly powerful tool, and the industries that are adopting business analytics are growing. In the restaurant industry, for example, data analytics can be used to predict how busy a restaurant will be based on factors such as weather, special events (if there is a conference in town, sporting event, or concert), and seasonal demand. With this demand forecast, the restaurant manager can then accurately plan the number of staff to schedule and how much food to order, reducing operational costs, increasing profit margins, and offering a better guest experience. Businesses that know how to leverage their data have a competitive edge against their competitors and are able to plan dynamically according to data trends.
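The restaurant example above can be made concrete with a toy forecast. All numbers here are hypothetical: a handful of (temperature, covers served) observations fit with ordinary least squares, plus an invented flat uplift when an event is in town. A real forecast would use far more history and factors (day of week, reservations, seasonality).

```python
# Toy demand forecast: fit covers served against temperature, then
# apply a simple multiplier when there is an event in town.

# Hypothetical history on ordinary days: (temperature in C, covers served)
history = [(5, 80), (10, 95), (15, 110), (20, 130), (25, 150), (30, 160)]

# Ordinary least-squares fit of covers = intercept + slope * temperature
n = len(history)
mean_t = sum(t for t, _ in history) / n
mean_c = sum(c for _, c in history) / n
slope = (sum((t - mean_t) * (c - mean_c) for t, c in history)
         / sum((t - mean_t) ** 2 for t, _ in history))
intercept = mean_c - slope * mean_t

def forecast(temp_c, event_in_town=False):
    """Baseline from the temperature trend, plus a flat 30% uplift for events."""
    base = intercept + slope * temp_c
    return round(base * (1.3 if event_in_town else 1.0))

print(forecast(22), forecast(22, event_in_town=True))
```

Even a sketch like this gives the manager a defensible number for staffing and ordering, which is the point of the restaurant example: planning from data rather than gut feel.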
Ask us how we can help you make the most of your data commodities and how our customers in Calgary, Edmonton, Saskatoon, and British Columbia are saving time, reducing costs, and increasing revenue with business analytics. Contact us at firstname.lastname@example.org or (587)-885-1090.
It’s happened more than once that we have had customers call us in after struggling to implement their analytics software for a year or more. Although we’re happy to be able to help, it would be nice to see more success for these projects early on. Based on what we have seen it is not surprising to read the McKinsey study results showing 25% of companies have felt their analytics activities have been unsuccessful and a full 86% suggest that their analytics efforts have been only somewhat effective (or worse) at meeting their primary objective.
So what’s the problem and what do we do about it? How do we get the Return on Investment (ROI) that is promised when analytics sales are made?
Meta S Brown has some great tips for avoiding the downfalls and getting success out of your next analytics project in this article.
I’d like to reiterate the point Meta makes about starting small. Getting a success out the door is much better than trying to solve the biggest problem, with the biggest potential impact, as the first project. This start-small approach is ideal to run with Agile project management: gain some success quickly and build up user buy-in.
Speaking of user buy-in, it is a key part of any successful project. This was mentioned in our Agile Data Warehouse blog post, but it’s worth repeating here. Gaining support from end users not only shows immediate value in the project but also builds support for future projects. Staying away from the “if we build it, they will come” approach is key to driving value out of any analytics project.
Data-driven decisions can be key to market differentiation and gaining a competitive advantage. There is real ROI to be had by implementing analytics wisely. Ask us how we can help put you on the right side of the statistics for analytics-project ROI!
Feel free to ask us for guidance in getting the most out of your analytics projects. We are happy to help you build a plan for analytics success. Contact us at email@example.com or (587)-885-1090.
You may not be able to predict the winning lottery numbers, but with today’s predictive analytics options you won’t need to. It’s now possible to increase your company’s revenue by predicting trends and succeeding in a more traditional sense than winning the lottery. By analyzing your data you can predict customer churn or employee churn, see which customers are most likely to purchase and focus your sales efforts there, determine the best place for your next business location, and much more. The possibilities are virtually limitless. In addition to increasing revenue, there are practical ways to reduce costs by highlighting areas where you can cut waste, reduce downtime, or optimize maintenance practices.
Predictive Analytics can be used to improve Operations, Sales, Fraud detection, Marketing, Crime Prevention… and more!
Not Magic and Not Just for Technical Wizards
We have been helping our customers evaluate predictive analytics solutions for many years, and there is now an option from IBM that provides many benefits with virtually no learning curve. Watson Analytics allows any user with a data set to quickly visualize unbiased predictions. IBM SPSS Modeler is another great tool, with a wide variety of predictive options that go much deeper than Watson Analytics. So which one do you need? Watson Analytics offers a free trial of the complete version for the first 30 days and a limited version after the trial ends; SPSS Modeler also has a 30-day free trial. Using either of these tools you can easily use historical data to predict what will happen next and visualize trends.
Here’s how to get started with each and where you might want to look first.
Watson Analytics is known for giving any user, regardless of technical background, a friendly interface for analyzing data. With natural language processing, it is possible to upload data and start analyzing without having to manipulate the data or write cryptic code. Suggested reports and analyses are provided automatically, with the ability to drill into further detail. The power of this data analysis on its own is impressive, particularly when it is so easily accessible. An important additional feature, however, is the ability to explore trends and predictions based on the same data, again without the need for a technical background. You don’t need a computer science degree or a background in statistics to get real benefits from this tool. With Watson Analytics you will be able to visualize trends, patterns, correlations, and details for questions you didn’t even know to ask.
When creating a prediction in Watson Analytics you can choose up to five targets for the prediction; customer churn, for example, would be one target. Once the targets are selected, Watson Analytics automatically runs through thousands of algorithms to find the best model and the most likely predictors. You won’t need to know which type of model gives the best statistical fit, and you won’t end up with a prediction biased by your own hypothesis.
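As a rough illustration of what automated model selection means, here is a miniature version with hypothetical data (Watson Analytics’ actual algorithms are proprietary and vastly more sophisticated): try several candidate churn models, keep whichever fits the training data best, and report its accuracy on held-out records.

```python
# Miniature model-selection sketch (hypothetical data): try several
# candidate churn models and keep the best, the way Watson-style
# tooling does across thousands of algorithms at far larger scale.

# (months since last purchase, churned?) -- illustrative history
train = [(1, 0), (2, 0), (3, 0), (4, 0), (6, 1), (8, 1), (9, 1), (12, 1)]
holdout = [(2, 0), (5, 1), (7, 1), (3, 0)]

def accuracy(model, data):
    """Fraction of records the model classifies correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Candidate 1: majority-class baseline from the training data.
majority = round(sum(y for _, y in train) / len(train))
candidates = {"baseline": lambda x: majority}

# Candidates 2..n: simple threshold rules at each cutoff seen in training.
for cut in sorted({x for x, _ in train}):
    candidates[f"churn if months >= {cut}"] = (
        lambda x, c=cut: 1 if x >= c else 0)

# Keep the candidate that scores best on the training data, then report
# its held-out accuracy as an honest estimate of real-world performance.
best_name = max(candidates, key=lambda name: accuracy(candidates[name], train))
print(best_name, accuracy(candidates[best_name], holdout))
```

The point of the sketch is the shape of the process, not the models: the tool, not the user, searches the candidate space and picks the statistical fit.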
Click here to access your free trial, upload your data, and see what you learn.
Then follow this post in the Watson Analytics community to help get you started with your first prediction. You’ll be on your way to visualizing trends almost immediately.
The trial version also includes Watson Analytics Social Media analytics for ten days, giving you the ability to analyze marketing campaigns and more by pulling in data from Twitter, review pages, blogs, and other sources. This lets you be The Fly on The Wall and understand what your customers or community are saying about your company and products. It is a powerful tool, providing insights into sentiment, demographics, and more.
Watson Analytics is a cloud-based offering, allowing you to upload your data immediately after signing up for your trial. Try it today and let us know what you think!
IBM SPSS Modeler
IBM SPSS Modeler is a user-friendly tool you can learn to use to create accurate predictive models for valuable insights. You can drag and drop to create the workflow you need without programming, which simplifies the process of building models and visualizing results.
SPSS Modeler is the perfect tool for when you’ve outgrown your Excel spreadsheet and want to do more with predictive analytics, but without signing up for a programming course. It is a user-friendly environment that lets you draw out, analyze, and then communicate and validate results to change processes, policies, and systems through predictive intelligence. You can automate data preparation and modeling, access a full range of modeling algorithms, choose the one that best suits your needs, and get the predictive intelligence results you need. And if you happen to be a technical wizard who wants a highly customizable or scientific model, you can easily embed a model built in open source (R/Python).
SPSS is a true leader in predictive analytics, showing up in the top corner of the Gartner Magic Quadrant for completeness of vision and ability to execute. It is a tool that will help you improve decision making through accurate predictions and the discovery of data patterns. SPSS includes text analytics, entity analytics, decision management, and optimization for near real-time insights. SPSS Modeler can run as a stand-alone desktop installation or as a larger deployment integrated across the whole organization, as an on-premise, cloud, or hybrid solution.
You can create models to help you understand customer churn, perform retail market basket analysis, identify fraud, plan predictive maintenance, and more. With an SPSS model you can see specific actions to take per customer to avoid churn, or an approach for stopping fraud before it happens. You will be able to take immediate action with record-by-record analysis of predictions and correlations, with the probability of a target displayed for each record. Empower yourself to make effective decisions and take immediate action with predictive intelligence and SPSS Modeler insights.
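As a sketch of what record-by-record probability scoring looks like in principle: the coefficients, feature names, customer IDs, and the `churn_probability` helper below are all made up for illustration. A real SPSS Modeler flow would estimate the model from your own historical data rather than using hand-set weights.

```python
import math

# Hand-set logistic coefficients, for illustration only -- a real model
# would estimate these from historical churn data.
INTERCEPT, W_MONTHS, W_TICKETS = -4.0, 0.5, 0.8

def churn_probability(months_inactive: float, support_tickets: float) -> float:
    """Logistic score: estimated probability that this customer churns."""
    z = INTERCEPT + W_MONTHS * months_inactive + W_TICKETS * support_tickets
    return 1 / (1 + math.exp(-z))

# Hypothetical customer records: (months inactive, open support tickets)
customers = {"A-1001": (1, 0), "A-1002": (6, 2), "A-1003": (9, 4)}

# Record-by-record output: a probability per customer, plus the action
# it suggests -- the per-record view described above.
for cust_id, (months, tickets) in customers.items():
    p = churn_probability(months, tickets)
    action = "call with retention offer" if p >= 0.5 else "no action"
    print(f"{cust_id}: churn risk {p:.2f} -> {action}")
```

The per-record probabilities are what make the insight actionable: instead of a single aggregate churn rate, each customer gets a score and a next step.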
For your 30-day free SPSS trial version, click here. You can get started right away with a drag-and-drop model that lets you apply predictive intelligence to your business data. Feel free to talk to us about how you’d like to use predictive intelligence, and we can help point you in a direction that will increase revenue, reduce costs, and/or save time.
Predictive Analytics without the Pain
Whether you’re using Watson Analytics or IBM SPSS Modeler, you can quickly increase your revenue and reduce costs by seeing the future trends in your data.
Feel free to ask us for guidance in getting started with either Watson Analytics or IBM SPSS Modeler. We have people happy to help with either product and to provide suggestions for making the most of the 30-day trial versions or an ongoing tool evaluation. Contact us at firstname.lastname@example.org or (587)-885-1090.