Friday, May 08, 2015

The Shifting Role of System Integrators

Automation vendors are building more strategic relationships with integrators in an effort to offer holistic—and proactive—services.

The energy level in Washington, D.C., rose last week as motivated members of the Control System Integrators Association (CSIA) attended the CSIA 2015 Executive Conference, which took place in our nation’s capital.
CSIA has over 500 members and provides them with a community of best practices, certifications, and working groups. The conference serves as a way to network with peers, meet potential partners, and learn about industry trends. And the system integrators in attendance appreciate the collaborative spirit of the forum.
“It is a foundation for establishing processes and best practices, and as a systems integrator, I always get great ideas at these meetings,” says Janet Campoverde, operations manager at Enterprise Automation in Irvine, Calif.

One idea percolating throughout the partner exhibit was the changing role of the system integrator from a project-based partner to a strategic solutions provider who can help manufacturers with the Industrial Internet of Things (IIoT), system performance improvement, and predictive maintenance.
“The role of the system integrator is important as they are the liaison between the customer and the vendor and they are trusted to deliver a best-in-class solution,” says Benson Hougland, vice president of Opto 22. “A good integrator will recognize trends and apply the right technology.”
According to a survey that ARC Advisory Group recently conducted, manufacturers look to a reliable, certified system integrator to help them move from being reactive to predictive. To that end, system integrators need to evolve to meet new data-driven demands, and that may mean adopting tools for themselves that will position them as a manufacturing partner.
GE Intelligent Platforms was at the CSIA 2015 conference to talk with integrators about its Equipment Insight software, a cloud-hosted application for industrial data collection, analysis, and management. While geared toward OEMs, GE is testing the concept that system integrators would want to use the software as part of their own equipment maintenance service. “A system integrator could act as an OEM of an overall system vs. servicing just a particular machine,” says Stephen Pavlosky, GE’s Equipment Insight leader. “It’s a proactive conversation with customers which is dramatically different than a reactive conversation.”
Similarly, Canary Labs, which makes real-time data historian and trending analysis software, sees the role of the system integrator evolving to the point where integrators become solution providers, extracting data from the plant to gain deeper insight into what’s happening. Adopting a tool that enables that level of visibility could change the way control system integrators work with customers. “It would be a big feather in their cap to be different,” says Don Mast, Canary Labs' information solutions consultant.
In addition, OEMs and automation suppliers recognize that system integrators are already working closely with end users and could be strategic partners as they leverage the Internet of Things and machine-to-machine (M2M) technology to build out new types of services.
“Integrators are important to us as they provide the first line of defense for technology support and are highly engaged with the end users,” says Tim Beckel, ABB’s regional channel manager for process automation control technologies.

Thomas Schaefer, Rockwell Automation’s global industry manager for water/wastewater, echoed that thought, noting that the company is building out expertise and partnerships in a very focused manner. “Integrators are already strategic to our business. And now we are taking a more vertical approach to industries and integrators.”

Friday, October 24, 2014

Before Things Go Out of Control

Solutions consultant Don Mast talks about improving safety at oil refineries across the country. One way to do this is through the tools Canary Labs provides:

With increased focus on the booming oil and gas industry from the media, the government, anti-gas activist groups, and industrial health and safety organizations, more needs to be done to prevent accidents upstream, midstream, and downstream.
Refineries convert raw materials into usable products.
Fact: Workers at American oil refineries die on the job about three times as often as their counterparts in other countries. One example is the Tesoro refinery in Washington state, which exploded in 2010, killing seven people. The U.S. Chemical Safety Board, which investigates the nation’s worst industrial accidents, says that on average there is a significant accident at a U.S. oil refinery once every three days. Companies frequently report missing data, or whole chunks of time, and when that data is gone, bad things can happen, including injuries, downtime, and massive fines. With the technology available today, such accidents can be reduced or prevented. The answer is data.
Canary Labs, a leader in information solutions for the oil and gas industry, is working with global oil and gas companies, pipelines, drillers, and refiners to protect workers and equipment, reduce production and exploration costs, and improve process and production metrics, while providing the data needed to drive crucial decisions before things at the job site go out of control.
Canary Labs’ open, flexible, high-performance enterprise software runs around the clock, monitoring equipment and process metrics by collecting data via the Canary Historian and presenting both real-time and historical data trends, giving operators a clear picture of current plant and job-site conditions. Canary Labs software also gives operators access to invisible variables that are not easily detected directly, providing early warnings of problems that can be resolved before they become disasters. Solutions can be deployed as a simple single-site installation or as a complex distributed enterprise resource.
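One way to picture the "invisible variables" idea is a derived metric such as rate of change, which is never measured directly but computed from successive samples, with an early-warning threshold applied to it. The sketch below is a hypothetical illustration, not Canary Labs' actual software or API; the window size, limit, and alert text are all assumptions.

```python
from collections import deque

def make_rate_monitor(window=5, limit=2.0):
    """Watch a process tag and flag abnormally fast rises.

    The rate of change is an 'invisible variable': it is not
    measured directly, but derived from successive samples.
    """
    samples = deque(maxlen=window)

    def check(value):
        samples.append(value)
        if len(samples) < 2:
            return None  # not enough history yet
        # Average rise per sample across the window
        rate = (samples[-1] - samples[0]) / (len(samples) - 1)
        if rate > limit:
            return f"EARLY WARNING: rising {rate:.1f} units/sample"
        return None

    return check

# Example: a slowly climbing reading suddenly accelerates
monitor = make_rate_monitor(window=3, limit=2.0)
for reading in [100, 101, 102, 107, 114]:
    alert = monitor(reading)
    if alert:
        print(alert)
```

The point of the derived variable is that the raw readings alone look unremarkable; only the rate computed over the window crosses the alert threshold.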
Driving innovation in the data industry, Canary Labs technology allows operators to collect data from distributed or remote sites and ensures that the data reaches the historian even in the event of a communication outage. Furthermore, Canary’s Axiom trending solution is an intuitive, easy-to-use data analysis tool that visually transforms process data into knowledge, empowering operators, engineers, and managers to stay connected through a common analytical tool for viewing data anywhere, at any time, via mobile devices. With increased scrutiny of the industry by governments and anti-gas groups, Canary Labs also provides streamlined, detailed reporting for regulatory compliance.
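The outage-tolerant collection described above follows a common store-and-forward pattern: buffer samples locally at the remote site and discard each one only after the historian confirms delivery. A minimal sketch of the pattern, assuming a hypothetical `send` callable rather than Canary's real protocol:

```python
import time
from collections import deque

class StoreAndForward:
    """Buffer timestamped samples locally; flush when the link is up."""

    def __init__(self, send):
        self.send = send       # callable that raises ConnectionError on link failure
        self.buffer = deque()  # samples not yet confirmed delivered

    def record(self, tag, value):
        self.buffer.append((time.time(), tag, value))
        self.flush()

    def flush(self):
        # Deliver oldest-first so the historian sees data in order.
        while self.buffer:
            try:
                self.send(self.buffer[0])
            except ConnectionError:
                return  # link is down; keep buffering and retry later
            self.buffer.popleft()  # delivery confirmed; safe to drop local copy
```

Because a sample leaves the local buffer only after `send` succeeds, a communication outage costs latency rather than data.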
“Canary Trending is an invaluable tool in our refinery. It runs around the clock, 24/7. Over the last 14+ years, Canary has provided quality products that utilize the latest technology,” says Kevin Moran, system engineer at the Delaware City Refinery. “As a trending tool, we like the flexibility in how it’s configured and the ease of use. Users can drag and drop to add trends to the chart and save charts for later recall. The speed is a big plus, allowing users to quickly scan and find the historical data they want to see. Users can access overview charts and then quickly drill down to see detailed data. The user doesn’t have to be a genius to figure things out.”
Randy Walker, control systems engineer, says, “Canary is a valuable tool allowing us to graphically review archived data for maintenance issues and performance. Templates of commonly reviewed trends can be saved for quick future access. The export utility is used to generate viewed trends into reports for distribution. This trending software has proven to be a valuable asset in our day-to-day operations.”
For nearly 30 years, Canary Labs has been a game-changer in the oil and gas, industrial automation, energy production and distribution sectors delivering world-class, real-time data historian and trending tools with a focus on safety and solutions.  

Friday, September 05, 2014


8 Reasons Big Data Projects Fail


Most companies remain on the big data sidelines too long, then fail. An iterative, start-small approach can help you avoid common pitfalls.
Big data is all the rage, and many organizations are hell-bent on putting their data to use. Despite the big data hype, however, 92% of organizations are still stuck in neutral, either planning to get started "some day" or avoiding big data projects altogether. For those that do kick off big data projects, most fail, and frequently for the same reasons.
It doesn't have to be this way.
The key to big data success is to take an iterative approach that relies on existing employees to start small and learn by failing early and often.
Herd mentality
Big data is a big deal. According to Gartner, 64% of organizations surveyed in 2013 had already purchased or were planning to invest in big data systems, compared with 58% of those surveyed in 2012. More and more companies are diving into their data, trying to put it to use to minimize customer churn, analyze financial risk, and improve the customer experience.
Of that 64%, 30% have already invested in big data technology, 19% plan to invest within the next year, and another 15% plan to invest within two years. Less than 8% of Gartner's 720 respondents, however, have actually deployed big data technology.
That's bad, but the reason for the failure to launch is worse: Most companies simply don't know what they're doing when it comes to big data.
It's no wonder that so many companies are spending a small fortune to recruit and hire data scientists, with salaries currently averaging $123,000.
8 ways to fail
Because so many organizations are flying blind with their data, they stumble in predictable ways (including thinking that a data scientist will magically solve all their problems, but more on that below). Gartner's Svetlana Sicular has catalogued eight common causes of big data project failures, including:

Management resistance. Despite what data might tell us, Fortune Knowledge Group found that 62% of business leaders said they tend to trust their gut, and 61% said real-world insight tops hard analytics when making decisions.

Selecting the wrong uses. Companies either start with an overly ambitious project that they're not yet ready to tackle, or they attempt to solve big data problems using traditional data technologies. In either case, failure is the usual result.

Asking the wrong questions. Data science is a complex blend of domain knowledge (the deep understanding of banking, retail, or another industry); math and statistics expertise; and programming skills. Too many organizations hire data scientists who might be math and programming geniuses but who lack the most important component: domain knowledge. Sicular is right when she advises that it's best to look for data scientists from within, as "learning Hadoop is easier than learning the business."

Lacking the right skills. This one is closely related to "asking the wrong questions." Too many big data projects stall or fail due to the insufficient skills of those involved. Usually the people involved come from IT -- and those are not the people most qualified to ask the right questions of the data.

Unanticipated problems beyond big data technology. Analyzing data is just one component of a big data project. Being able to access and process the data is critical, but that can be thwarted by such things as network congestion, training of personnel, and more.

Disagreement on enterprise strategy. Big data projects succeed when they're not really isolated "projects" at all but rather core to how a company uses its data. The problem is exacerbated if different groups value cloud or other strategic priorities more highly than big data.

Big data silos. Big data vendors are fond of talking about "data lakes" and "data hubs," but the reality is that many businesses attempt to build the equivalent of data puddles, with sharp boundaries between the marketing data puddle, the manufacturing data puddle, and so on. Big data is more valuable to an organization if the walls between groups come down and their data flows together. Politics or policies often stymie this promise.

Problem avoidance. Sometimes we know or suspect the data will require us to take action that we'd rather not take, like the pharmaceutical industry not running sentiment analysis because it wants to avoid the subsequent legal obligation to report adverse side effects to the U.S. Food and Drug Administration.
Throughout this list, one common theme emerges: As much as we might want to focus on data, people keep getting in the way. As much as we might want to be ruled by data, people ultimately rule the big data process, including making the initial decisions as to which data to collect and keep, and which questions to ask of it.
Innovate by iterating
Because so many organizations seem hamstrung in their attempts to start a big data project, coupled with the likelihood that most big data projects will fail, it's imperative to take an iterative approach to big data. Rather than starting with a hefty payment to a consultant or vendor, organizations should look for ways to set their own employees free to experiment with data.
A "start small, fail fast" approach is made possible, in part, by the fact that nearly all significant big data technology is open source. What's more, many platforms are immediately and affordably accessible as cloud services, further lowering the bar to trial-and-error.
Big data is all about asking the right questions, which is why it's so important to rely on existing employees. But even with superior domain knowledge, organizations will still fail to collect the right data and to ask pertinent questions at the start. Such failures should be expected and accepted.
The key is to use flexible, open-data infrastructure that allows an organization's employees to continually tweak their approach until their efforts bear real fruit. In this way, organizations can eliminate the fear and iterate toward effective use of big data.
Matt Asay is Vice President of Community at MongoDB. He was previously VP of Business Development at Nodeable. You can reach him at mjasay@mac.com and follow him on Twitter @mjasay.

Wednesday, January 22, 2014

Whatever it Takes Reality TV Show Sizzle Reel with Grant Cardone



Grant Cardone, also known for his business advice and the reality TV show Turnaround King, has produced and created a new reality show, Whatever It Takes, starring his wife Elena Cardone and his staff.

Cardone, known for his unconventional methodologies, is using reality TV to hire employees. Grant Cardone is a New York Times best-selling author and an international sales and business expert who writes for Entrepreneur Magazine, Business Insider, and Wells Fargo.

This show is a combination of Fear Factor, Undercover Boss, The Apprentice, Punk'd, The Profit, and Turnaround King, with a splash of Simon Cowell and Gordon Ramsay; the only difference is that Cardone is giving real people real jobs.

Did you find me? Can't wait to watch! Blessed to be part of it! Fantastic experience. 

Must Read Article: Brand And Marketing Trends For 2014



By Robert Passikoff, Forbes Magazine


It was management consultant Peter Drucker who advised that the best way to predict the future is to create it. Creating new things being difficult, the next best way is to have access to validated and predictive loyalty and emotional engagement metrics to help point the way. Happily, we do. And after examining over 100,000 consumer assessments, we’ve identified 14 critical trends to help marketers create their own successful futures next year...

    Friday, January 03, 2014

    Great Business Reads To Get 2014 Started With A "BOOM!"


    1. Gary Vaynerchuk - Jab, Jab, Jab, Right Hook
    2. Marc Ecko - Unlabel
    3. Grant Cardone - The 10X Rule

    I encourage you to read them all to help with finding your voice, and with success, failure, branding, sales, marketing, social media, motivation, entrepreneurship, and business start-ups.


    Thursday, October 31, 2013

    WHATEVER IT TAKES FILMING..


    My team during the filming of the cable reality TV show "Whatever It Takes!"

    Monday, September 23, 2013

    Costs of Billboard Advertising


    By Gaebler.com Staff Writer
    September 13, 2013

    How much does billboard advertising cost? Are the costs of billboard advertising worth the money? Using billboards to advertise your products and services might be a smart move, and billboard prices might be less than you think.

    Billboard advertising can be an effective and cost-efficient way for entrepreneurs to spread the word about their products and services. The Outdoor Advertising Association of America estimates that U.S. businesses spent more than $5.5 billion on outdoor advertising in 2003 and anticipates a healthy increase for 2004. No matter how you slice it, billboard advertising is on the rise in America.

    There are a number of reasons for the recent surge in billboard advertising, not the least of which is cost efficiency. Compared to other forms of advertising, billboards are a relatively inexpensive way to get your point across to the general public.

    Consider this: A newspaper ad is only good for a day and a television commercial only lasts about thirty seconds. But a billboard ad is working for you twenty-four hours a day, seven days a week.

    The cost of billboard advertising ranges from about $700 to $2,500 a month. At that rate, ten billboards could run as much as $25,000 per month. That sounds like a lot of money, until you realize that a full-page ad running for one day in a major newspaper costs about the same.
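The arithmetic behind that comparison is easy to check. A quick sketch using the article's figures (actual rates vary widely by market and location):

```python
# Monthly cost range per billboard, from the article
low, high = 700, 2500
boards = 10

min_monthly = boards * low    # all ten boards at the low end
max_monthly = boards * high   # all ten boards at the high end

print(f"Ten billboards: ${min_monthly:,} to ${max_monthly:,} per month")
```

The $25,000 figure in the text is the top of the range; a campaign of ten low-rate boards could cost as little as $7,000 a month.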
    Advances in technology have also contributed to billboard advertising's cost efficiency. In the past, billboards had to be hand-painted - a time-consuming and costly venture. But with today's computer technology, billboards are designed on a computer screen, printed to vinyl or poster paper, and glued to the billboard structure. The result: Higher quality ads in less time for less money.

    Before you jump into billboard advertising for your business, there are a few things you need to understand.

    1. The amount of information contained in a billboard ad is limited. If you expect your billboard to convey as much information as a print ad - forget it. It's just not possible. Keep your ads short and catchy. When it comes to billboards think more visuals, fewer words.

    2. Billboards are effective, but they do have their limitations. For that reason, (and others), smart business owners view billboard advertising as one part of a balanced marketing strategy. An integrated marketing strategy involving print, broadcast media, and billboards is key for attracting and retaining new customers.

    3. Know your market. People who own automobiles tend to be more affluent and mobile, so billboard ads typically target middle- to upper-income demographics. It also pays to be aware of the traffic patterns of your target customer base; this will be invaluable in helping you find the right placement for your business's billboard ads.

    For more information on billboard advertising in Pennsylvania, Call "The Billboard Guy", Don Mast (814) 660-2012 or email mast@84outdoor.com .