Friday, October 24, 2014

Before Things Go Out of Control

Solutions Consultant Don Mast talks about providing more safety in oil refineries across the country. One way to do this is through the tools Canary Labs provides:

With increased focus on the booming oil and gas industry from the media, the government, anti-gas advocacy groups, and health and safety organizations, more needs to be done to prevent accidents upstream, midstream, and downstream.
Refineries convert raw materials into usable products.
Fact: Workers at American oil refineries die on the job about three times as often as their counterparts in other countries. One example is the Tesoro refinery in Washington state, which exploded in 2010, killing seven people. Recently the U.S. Chemical Safety Board, which investigates our nation's worst industrial accidents, reported that, on average, there is a significant accident at an oil refinery somewhere in the country once every three days. Companies report all the time that they are missing data or chunks of time, and when that data is gone, bad things can happen, including injuries, downtime, and massive fines. With all the technology available, these incidents can be reduced and prevented. The answer is data.
Canary Labs, a leader in information solutions for the oil and gas industry, is working with global oil and gas companies, pipelines, drillers, and refining companies to protect workers and equipment, reduce production and exploration costs, and improve process and production metrics, while providing the data needed to drive crucial decisions before things at the job site go out of control.
Utilizing the latest technology, Canary Labs' open, flexible, high-performance enterprise software runs around the clock to monitor equipment and process metrics, collecting data via the Canary Historian and presenting both real-time and historical data trends so that operators have a clear understanding of current plant and job-site conditions. Canary Labs software tools also give operators access to variables that are not easily detected, providing early-warning alerts to problems that can be resolved before they become a disaster. Solutions can be deployed as a simple single-site installation or as a complex distributed enterprise resource.
Driving innovation in the data industry, Canary Labs technology allows operators to collect data from distributed or remote sites and ensures that the data reaches the historian even in the event of a communication outage. Furthermore, Canary's Axiom trending solution, AxiomTrend, is an intuitive, easy-to-use data analysis tool. It visually transforms process data into knowledge, empowering operators, engineers, and managers to stay connected while using a common analytical tool to view data anywhere, at any time, via mobile devices. With increased scrutiny on the industry by governments and anti-gas groups, Canary Labs also provides streamlined, detailed reporting for regulatory compliance.
“Canary Trending is an invaluable tool in our refinery. It runs around the clock, 24/7. Over the last 14-plus years, Canary has provided quality products that utilize the latest technology,” says Kevin Moran, System Engineer at the Delaware City Refinery. “As a trending tool, we like the flexibility in how it's configured and the ease of use. Users can drag and drop to add trends to the chart and save charts for later recall. The speed is a big plus, allowing users to quickly scan and find the historical data they want to see. Users can access overview charts and then quickly drill down to see detailed data. The user doesn't have to be a genius to figure things out.”
Randy Walker, Control Systems Engineer, says, “Canary is a valuable tool that allows us to graphically review archived data for maintenance issues and performance. Templates of commonly reviewed trends can be saved for quick future access, and the export utility turns viewed trends into reports for distribution. This trending software has proven to be a valuable asset in our day-to-day operations.”
For nearly 30 years, Canary Labs has been a game-changer in the oil and gas, industrial automation, and energy production and distribution sectors, delivering world-class real-time data historian and trending tools with a focus on safety and solutions.

Friday, September 05, 2014


8 Reasons Big Data Projects Fail


Most companies remain on the big data sidelines too long, then fail. An iterative, start-small approach can help you avoid common pitfalls.
Big data is all the rage, and many organizations are hell bent on putting their data to use. Despite the big data hype, however, 92% of organizations are still stuck in neutral, either planning to get started "some day" or avoiding big data projects altogether. For those that do kick off big data projects, most fail, and frequently for the same reasons.
It doesn't have to be this way.
The key to big data success is to take an iterative approach that relies on existing employees to start small and learn by failing early and often.
Herd mentality
Big data is a big deal. According to Gartner, 64% of organizations surveyed in 2013 had already purchased or were planning to invest in big data systems, compared with 58% of those surveyed in 2012. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.
Of that 64%, 30% have already invested in big data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Fewer than 8% of Gartner's 720 respondents, however, have actually deployed big data technology.
That's bad, but the reason for the failure to launch is worse: Most companies simply don't know what they're doing when it comes to big data.
It's no wonder that so many companies are spending a small fortune to recruit and hire data scientists, with salaries currently averaging $123,000.
8 ways to fail
Because so many organizations are flying blind with their data, they stumble in predictable ways (including thinking that a data scientist will magically solve all their problems, but more on that below). Gartner's Svetlana Sicular has catalogued eight common causes of big data project failures, including:

Management resistance. Despite what data might tell us, Fortune Knowledge Group found that 62% of business leaders said they tend to trust their gut, and 61% said real-world insight tops hard analytics when making decisions.

Selecting the wrong uses. Companies either start with an overly ambitious project that they're not yet ready to tackle, or they attempt to solve big data problems using traditional data technologies. In either case, failure is the usual result.

Asking the wrong questions. Data science is a complex blend of domain knowledge (the deep understanding of banking, retail, or another industry); math and statistics expertise; and programming skills. Too many organizations hire data scientists who might be math and programming geniuses but who lack the most important component: domain knowledge. Sicular is right when she advises that it's best to look for data scientists from within, as "learning Hadoop is easier than learning the business."

Lacking the right skills. This one is closely related to "asking the wrong questions." Too many big data projects stall or fail due to the insufficient skills of those involved. Usually the people involved come from IT -- and those are not the people most qualified to ask the right questions of the data.

Unanticipated problems beyond big data technology. Analyzing data is just one component of a big data project. Being able to access and process the data is critical, but that can be thwarted by such things as network congestion, training of personnel, and more.

Disagreement on enterprise strategy. Big data projects succeed when they're not really isolated "projects" at all but rather core to how a company uses its data. The problem is exacerbated if different groups value cloud or other strategic priorities more highly than big data.

Big data silos. Big data vendors are fond of talking about "data lakes" and "data hubs," but the reality is that many businesses attempt to build the equivalent of data puddles, with sharp boundaries between the marketing data puddle, the manufacturing data puddle, and so on. Big data is more valuable to an organization if the walls between groups come down and their data flows together. Politics or policies often stymie this promise.

Problem avoidance. Sometimes we know or suspect the data will require us to take action we would rather avoid, like the pharmaceutical industry not running sentiment analysis because it wants to avoid the subsequent legal obligation to report adverse side effects to the U.S. Food and Drug Administration.
Throughout this list, one common theme emerges: As much as we might want to focus on data, people keep getting in the way. As much as we might want to be ruled by data, people ultimately rule the big data process, including making the initial decisions as to which data to collect and keep, and which questions to ask of it.
Innovate by iterating
Because so many organizations are hamstrung in their attempts to start a big data project, and because most big data projects fail, it's imperative to take an iterative approach. Rather than starting with a hefty payment to a consultant or vendor, organizations should look for ways to set their own employees free to experiment with data.
A "start small, fail fast" approach is made possible, in part, by the fact that nearly all significant big data technology is open source. What's more, many platforms are immediately and affordably accessible as cloud services, further lowering the bar to trial-and-error.
Big data is all about asking the right questions, which is why it's so important to rely on existing employees. But even with superior domain knowledge, organizations still will fail to collect the right data and they'll fail to ask pertinent questions at the start. Such failures should be expected and accepted.
The key is to use flexible, open-data infrastructure that allows an organization's employees to continually tweak their approach until their efforts bear real fruit. In this way, organizations can eliminate the fear and iterate toward effective use of big data.
Matt Asay is Vice President of Community at MongoDB. He was previously VP of Business Development at Nodeable. You can reach him at mjasay@mac.com and follow him on Twitter @mjasay.

Wednesday, January 22, 2014

Whatever it Takes Reality TV Show Sizzle Reel with Grant Cardone



Grant Cardone, also known for his business advice and his reality TV show TurnAround King, has produced and created a new reality show, Whatever It Takes, starring his wife Elena Cardone and his staff.

Cardone, known for his unconventional methodologies, is using reality TV to hire employees. Grant Cardone is a New York Times best-selling author and international sales and business expert who writes for Entrepreneur Magazine, Business Insider and Wells Fargo.

This show is a combination of Fear Factor, Undercover Boss, The Apprentice, Punk'd, The Profit and TurnAround King, with a splash of Simon Cowell and Gordon Ramsay, the one difference being that Cardone is giving real people real jobs.


Must Read Article: Brand And Marketing Trends For 2014



By Robert Passikoff, Forbes Magazine


It was management consultant Peter Drucker who advised that the best way to predict the future is to create it. Creating new things being difficult, the next best way is to have access to validated and predictive loyalty and emotional engagement metrics to help point the way. Happily, we do. And after examining over 100,000 consumer assessments, we've identified 14 critical trends to help marketers create their own successful futures next year...

Friday, January 03, 2014

Great Business Reads To Get 2014 Started With A "BOOM!"


1. Gary Vaynerchuk - Jab, Jab, Jab, Right Hook
2. Marc Ecko - Unlabel
3. Grant Cardone - The 10X Rule

I encourage you to read them all for help with finding your voice, success, failure, branding, sales, marketing, social media, motivation, entrepreneurship and business start-ups.