Sales Force Marketing For Big Data Solutions

Big Data companies to watch

These big data companies are ones to watch

Which companies are breaking new ground with big data technology? We ask 10 industry experts.

It’s hard enough staying on top of the latest developments in the technology industry. That’s doubly true in the fast-growing area known as big data, with new companies, products and services popping up practically every day.

There are scores of promising big data companies, but Fortune sought to cut through the noise and reached out to a number of luminaries in the field to ask which big data companies they believe have the biggest potential. Which players are really the ones to watch?

That question, we learned, is rather difficult to answer.

“A list of ‘big data companies’ is interesting because of the definition,” said Dean Abbott, co-founder and chief data scientist of Smarter Remarketer. “Is a ‘big data’ company one that is pure-play analytics with big data? A company that does predictive analytics often with big data but not always—like Beyond the Arc or Elder Research? A large company that has a group that usually does ‘big data analytics’ but it’s only a small part of what they do as a company—like HP, SAP, Dell, even GE?”

‘One of the most interesting ones I’ve seen’

There was certainly consensus on some of the big data companies that industry experts said were notable. At least two of 10 experts we polled named MapR, MemSQL, Databricks, Platfora, Splunk, Teradata, Palantir, Premise, Datameer, Cloudera, Hortonworks, MongoDB, and Trifacta as shining examples in the space.

MemSQL, for example, is “an in-memory relational database that would be effective for mixed workloads and for analytics,” said Svetlana Sicular, an analyst at Gartner. “While SAP draws so much attention to in-memory databases by marketing their Hana database, MemSQL seems to be a less expensive and more agile solution in this space.”

Splunk, meanwhile, “has an excellent technology, and it was among the first big data companies to go public,” Sicular said. “Now, Splunk also has a strong product called Hunk—Splunk on Hadoop—directly delivering big data solutions that are more mature than most products in the market. Hunk is easy to use compared to many big data products, and generally, most customers I spoke with expressed their love to Splunk without any soliciting on my side.”

Palantir Technologies, which focuses on data analysis for public sector clients, also received high marks. “I’d have to put Palantir at the top of the list” of the startups in the big data space, said Tom Davenport, an IT management professor at Babson College and author of the book, Big Data @ Work.

DJ Patil, vice president of product at RelateIQ, said San Francisco-based Trifacta—which makes a “data transformation platform” promising increased productivity for data analysts—was a company to watch. “One of the most interesting ones I’ve seen,” said Patil, who serves as a technical advisor to the company.

Two miles away, cross-town peer Datameer is also remarkable, said Carla Gentry, founder of Analytical-Solution. “There are lots more companies out there, but most of them just end up being a BI platform that anyone with an engineer could have started.” Datameer is different, she said.

‘Graphs have a great future’

Some of the less well-known companies received the highest praise.

Tamr, for instance, is “an exciting startup in data curation, so that would be my nomination,” said Gregory Piatetsky-Shapiro, president and editor of KDnuggets.com.

Neo Technology, the company behind open source graph database Neo4j, is another that Gartner’s Sicular pointed out. “I think graphs have a great future since they show data in its connections rather than a traditional atomic view,” she said. “Graph technologies are mostly unexplored by the enterprises but they are the solution that can deliver truly new insights from data.” (She also named Pivotal, The Hive, and Concurrent.)

DataStax, WibiData, Aerospike, Ayasdi, and ClearStory were all part of analyst Curt Monash’s “obvious inclusion” list, he said, while Automatic, Planet Labs, Sight Machine, DataPad, Interana, Wise.io, LendUp, Declara, Sentinel Labs, Fliptop, Sift Science, Import.io, and Segment.io were among those named by data scientist Pete Skomoroch.

Paxata and Informatica were both cited by Ovum analyst Tony Baer; IBM, Syntasa, Actian, and Tableau were four named by George Mason University professor and data scientist Kirk Borne.

“There are a number of startups in security and machine learning that are emerging,” Baer said. “What’s missing right now are startups that look at data governance, stewardship, lifecycle management for big data. Right now IBM is largely alone, but I’m expecting there will be more startup action to come.”

‘Most of these companies will go away’

If you’ve reached this point in the article, you will have read 42 recommendations by our panel of experts. All of them are first and foremost technology companies; most exist specifically to advance big data technology.

But some experts said that the most interesting big data companies aren’t big data companies at all. Established companies with traditional products and services are starting to develop offerings based on big data, Davenport said. Those include agriculture giant Monsanto, back-office operations stalwart Intuit, and the trucking company Schneider.

“To me, it’s much easier to create a startup than it is to change your entire way of thinking about information and how it relates to your business operation,” Davenport said. “One of the really exciting things about big data is when you use it to create new products and services.”

He added with hesitation: “It’s early days still, and we don’t know how easy it will be for companies to make money off these things.”

It is inevitable that there will eventually be a thinning of the big data herd, experts said.

“Most of these companies will go away because the most important part of the big data movement will be how to use data operationally—to make decisions for the business,” Smarter Remarketer’s Abbott said, “rather than who can merely crunch more data faster.”

Accenture Revenue Analytics and Business Intelligence


Revenue agencies face an environment of electronic filing, reduced resources, new taxpayer expectations and complex fraud schemes.

  • How can revenue agencies use data to improve risk management practices?
  • How can revenue agencies use data to better understand taxpayer behaviors?
  • How can revenue agencies improve debt collection, returns processing and audit activities?

More and more, leading revenue agencies are capturing significant amounts of data from taxpayers and third party providers. Many are asking questions like these and want to better analyze this data to understand taxpayer behaviors, improve fraud detection and risk management and enhance compliance enforcement.

Some are looking to business intelligence and predictive analytics to create new value from data. They are using quantitative methods to derive actionable insights and outcomes from data to meet these objectives and improve agency functions. Standard practice in the private sector, analytics can help revenue agencies achieve high performance by:

  • Generating more revenue.
  • Enhancing compliance via targeted compliance enforcement activities.
  • Improving audit, collections and returns processing procedures.
  • Optimizing use of personnel and resources.
  • Improving and streamlining taxpayer service.
  • Customizing service delivery channels based on taxpayer preference.
  • Improving operational visibility.
  • Understanding voluntary compliance.
  • Enabling faster and better decision making.
  • Optimizing the return on existing business and technology investments.
  • Reducing tax fraud.

Big Data Today

Incredible Ways Big Data Is Used Today

 


The term ‘Big Data’ is as big in management and technology as Justin Bieber and Miley Cyrus are in music. As with other mega buzzwords, many claim big data is all talk and no action. This couldn’t be further from the truth. With this post, I want to demonstrate how big data is used today to add real value.

Eventually, every aspect of our lives will be affected by big data. However, there are some areas where big data is already making a real difference today. I have grouped the applications of big data into the 10 areas where I see the most widespread use as well as the highest benefits. [For those of you who want to take a step back here and understand, in simple terms, what big data is, check out the posts in my Big Data Guru column.]

1. Understanding and Targeting Consumers
This is one of the biggest and most publicized areas of big data use today. Here, big data is used to better understand customers and their behaviors and preferences. Companies are keen to expand their traditional data sets with social media data, browser logs, as well as text analytics and sensor data to get a more complete picture of their customers. The big objective, in many cases, is to create predictive models. You might remember the example of U.S. retailer Target, which can now very accurately predict when one of its customers is expecting a baby. Using big data, telecom companies can now better predict customer churn; Walmart can predict which products will sell; and car insurance companies understand how well their customers actually drive. Even government election campaigns can be optimized using big data analytics. Some believe Obama’s win in the 2012 presidential election campaign was due to his team’s superior ability to use big data analytics.

2. Understanding and Optimizing Business Processes
Big data is also increasingly used to optimize business processes. Retailers are able to optimize their stock based on predictions generated from social media data, web search trends, and weather forecasts. One particular business process that is seeing a lot of big data analytics is supply chain or delivery route optimization. Here, geographic positioning and radio frequency identification sensors are used to track goods or delivery vehicles and optimize routes by integrating live traffic data, and so on. HR business processes are also being improved using big data analytics, including the optimization of talent acquisition, Moneyball style, as well as the measurement of company culture and staff engagement using big data tools.
3. Personal Quantification and Performance Optimization

Big data is not just for companies and governments but also for all of us individually. We can now benefit from the data generated by wearable devices such as smart watches or smart bracelets. Take the Up band from Jawbone as an example: the wristband collects data on our calorie consumption, activity levels, and sleep patterns. While it gives individuals rich insights, the real value is in analyzing the collective data. In Jawbone’s case, the company now collects 60 years’ worth of sleep data every night. Analyzing such volumes of data will bring entirely new insights that can be fed back to individual users. The other area where we benefit from big data analytics is finding love, online that is. Most online dating sites apply big data tools and algorithms to find us the most appropriate matches.
4. Improving Healthcare and Public Health

The computing power of big data analytics enables us to decode entire DNA strands in minutes, and it will allow us to find new cures and better understand and predict disease patterns. Just think what happens when all the individual data from smart watches and wearable devices can be applied to millions of people and their various diseases. The clinical trials of the future won’t be limited by small sample sizes but could potentially include everyone. Big data techniques are already being used to monitor babies in a specialist premature and sick baby unit. By recording and analyzing every heartbeat and breathing pattern of every baby, the unit was able to develop algorithms that can now predict infections 24 hours before any physical symptoms appear. That way, the team can intervene early and save fragile babies in an environment where every hour counts. What’s more, big data analytics allow us to monitor and predict the development of epidemics and disease outbreaks. Integrating data from medical records with social media analytics enables us to monitor flu outbreaks in real time, simply by listening to what people are saying, i.e. “Feeling rubbish today – in bed with a cold.”

5. Improving Sports Performance
Most elite sports have now embraced big data analytics. We have the IBM SlamTracker tool for tennis tournaments; we use video analytics that track the performance of every player in a soccer or baseball game; and sensor technology in sports equipment such as basketballs or golf clubs allows us to get feedback (via smartphones and cloud servers) on our game and how to improve it. Many elite sports teams also track athletes outside of the sporting environment, using smart technology to track nutrition and sleep, as well as social media conversations to monitor emotional wellbeing.

6. Improving Science and Research
Science and research is currently being transformed by the new possibilities big data brings. Take, for example, CERN, the Swiss nuclear physics lab with its Large Hadron Collider, the world’s largest and most powerful particle accelerator. Experiments to unlock the secrets of our universe – how it started and how it works – generate huge amounts of data. The CERN data center has 65,000 processors to analyze its 30 petabytes of data. However, it also uses the computing power of thousands of computers distributed across 150 data centers worldwide to analyze the data. Such computing power can be leveraged to transform many other areas of science and research.

7. Optimizing Machine and Device Performance
Big data analytics help machines and devices become smarter and more autonomous. For example, big data tools are used to operate Google’s self-driving car. The Toyota Prius is fitted with cameras, GPS, as well as powerful computers and sensors to drive safely on the road without the intervention of human beings. Big data tools are also used to optimize energy grids using data from smart meters. We can even use big data tools to optimize the performance of computers and data warehouses.

8. Improving Security and Law Enforcement
Big data is applied heavily in improving security and enabling law enforcement. I am sure you are aware of the revelations that the National Security Agency (NSA) in the U.S. uses big data analytics to foil terrorist plots (and maybe spy on us). Others use big data techniques to detect and prevent cyber attacks. Police forces use big data tools to catch criminals and even predict criminal activity, and credit card companies use big data to detect fraudulent transactions.

9. Improving and Optimizing Cities and Countries
Big data is used to improve many aspects of our cities and countries. For example, it allows cities to optimize traffic flows based on real-time traffic information as well as social media and weather data. A number of cities are currently piloting big data analytics with the aim of turning themselves into Smart Cities, where the transport infrastructure and utility processes are all joined up: a bus would wait for a delayed train, and traffic signals would predict traffic volumes and operate to minimize jams.

10. Financial Trading
My final category of big data application comes from financial trading. High-Frequency Trading (HFT) is an area where big data finds a lot of use today. Here, big data algorithms are used to make trading decisions. Today, the majority of equity trading takes place via data algorithms that increasingly take into account signals from social media networks and news websites to make buy and sell decisions in split seconds.
For me, the 10 categories I have outlined here represent the areas where big data is applied the most. Of course, there are many other applications of big data, and there will certainly be many new categories as the tools become more widespread.

Big Data

 

Big Push into Big Data

Midsize Companies Plan Big Push into Big Data

Ann All |   ARTICLES   |   POSTED 28 APR, 2014


Big Data is not just within the purview of big companies. That’s the takeaway of new research from Competitive Edge Research Reports. The firm’s study, which was commissioned by Dell, found that 41 percent of midsize companies are already involved in one or more Big Data projects while 55 percent are planning initiatives.

Midsize companies that have moved from planning into implementing Big Data projects are reporting successes, the research found.

About half of respondents with existing initiatives gave themselves high marks in six strategic tasks: improving speed and quality of decision-making, improving product quality, identifying and taking advantage of business opportunities, understanding constituent sentiment, understanding customer needs and predicting future trends that may impair achievement of business goals. In contrast, fewer than 30 percent of respondents that were still just planning Big Data projects scored themselves well in most of these categories.

The gap was greatest in the improved decision-making category, in which 50 percent of companies with Big Data in production said they performed “very well” in this area, vs. 23 percent of companies that are still in the planning stage.

Big Data Spending

Buoyed by these early wins, respondents expect to boost their Big Data budgets by at least $2 million over the next two years. The research firm predicts the average Big Data budget will reach $6 million during this time frame.

“The early success midmarket companies are seeing with their Big Data initiatives will encourage more growth and investment, and additional returns on that investment will be achieved as they dive further into different datasets and embrace ever-improving analytic capabilities,” said Darin Bartik, executive director, product management, information management, Dell Software, in a statement.

The study offers insight into the tools that will likely be included in Big Data budgets. The three tools ranked as most valuable were real-time processing of data and analytics, predictive analytics, and data visualization to convert processed data into actionable insights.

Big Data Challenges, Best Practices

While the study found many positive trends, some daunting Big Data challenges still remain. Named most frequently as challenges: wide variety of new data types and structures, cited by 40 percent of respondents; sheer volume of data slows processing, mentioned by 34 percent; and budget limitations to improve data analysis capabilities, 32 percent.

Perhaps the most valuable insights in the study are the factors respondents named as key to their Big Data success. Forty-one percent of them tapped strong collaboration between business units and IT organizations as a success factor. Similarly, IDC Retail Insights recently noted that companies with more advanced Big Data capabilities were far more likely than their less mature peers to involve both business and IT in their Big Data projects.


IDC Retail Insights recommends that companies establish a collaborative governance structure involving lines of business, IT and a separate analytics group. With Big Data initiatives, high achievers tend to put IT in a leadership role, a finding that runs counter to the common practice of having lines of business lead technology-driven business initiatives, IDC Retail Insights found. Under such models, IT is responsible for overall strategy, planning and application development, with business responsible for evaluating the capabilities created by IT and the ultimate business outcomes. The analytics group handles management of data, content and analytics.

The Dell-commissioned research recognizes “encouraging signs of shared responsibilities taking shape among the management ranks.” For example, while 76 percent of respondents said IT was the most responsible for implementing Big Data projects, 56 percent mentioned sales management in a leadership role. This emphasis on customer-facing issues reinforces a need for close collaboration, the report notes.

Ann All is the editor of Enterprise Apps Today and eSecurity Planet. She has covered business and technology for more than a decade, writing about everything from business intelligence to virtualization.

 

IBM’s Multi-Billion-Dollar Cloud/Big Data

Looking Far Ahead: IBM’s Multi-Billion-Dollar Cloud/Big Data Analytics Strategic Initiative

Rob Enderle |   UNFILTERED OPINION   |   POSTED 11 JUL, 2014



Yesterday, IBM announced its massive $3 billion investment effort to create a processor uniquely suited to the demands of the cloud and Big Data systems. This has both strategic and timing implications for the market. Efforts like this speak to the level of maturity in the market as well as IBM’s commitment to leading it. The announcement anticipates a massive improvement in processor scaling down to 7 nanometers and begins to flesh out what the Big Data world will look like in 2020.

 

Why Do You Care About 2020?

Often the mistake that both technology companies and IT managers make is focusing too much on the tactical and working to solve the problems of today. This tactical focus generally puts everyone in firefighting or whack-a-mole mode, constantly on the verge, and sometimes over it, of being overwhelmed by the changes they can barely keep up with.

The reason to maintain at least a five-year view is so that strategic efforts like Big Data analytics, which consume massive resources, aren’t prematurely obsolete and you can anticipate the world that will exist once they have matured. This eye on the future is often what distinguishes firms that survive decades from those that don’t make it to their first 10-year anniversary. The surviving firms have anticipated and prepared for changes.

What the IBM Announcement Means

This announcement means the market has reached the point where large solutions providers are beginning to build solutions from the ground up, not cobble together technologies that were designed for other things into a kludge that sort of works. The data analytics and cloud solutions increasingly demand a level of performance and granularity that wasn’t imagined when current chip technologies were incubated, and big vendors like IBM now understand what current technologies don’t do. Thus, they have a roadmap to create something that works better.

 


They are now looking beyond 7-nanometer technology into quantum computing, carbon nanotubes, and neurosynaptic computing. The first two are being worked on by a number of vendors now and promise, but have yet to deliver, massive improvements in processing speed and levels of security well beyond what we have today. Neurosynaptic computing promises systems a near-human-like capability to anticipate and proactively respond to future events, taking products like IBM’s Watson to a future where they largely train themselves and can figure out more quickly the answers to questions that users haven’t even thought to ask. Carbon nanotubes (CNTs), in particular, could help define the smartphones and personal devices of the future.

For those next-generation devices, which will be accessing these increasingly intelligent Watson-like back-end systems (think of Siri or more likely Microsoft’s Cortana on steroids), you’ll need advanced, low-powered transistors that are far more efficient. Many of those devices will be wearable, and to keep heat and power costs down to manageable levels, the data centers of tomorrow will find CNTs and Tunnel Field Effect Transistors (TFETs) critical. Finally, graphene will be critical to the future, as silicon runs out of performance headroom and a more efficient and powerful alternative is increasingly required.

They are also layering on advancements in silicon photonics, which massively improve the speed of data transport. That matters because Big Data cloud-hosted jobs tend to be very fluid with regard to where they are located, and they have to be moved depending on their relative importance to the company and the proximity of the user.

Wrapping Up: How IBM Lasts

IBM’s announcement both showcases that the cloud and Big Data analytics requirements of today can now be projected into the future and that it is turning its massive R&D engine into creating the technologies that future will require. This is a massive effort by IBM, showcasing the kind of commitment that made it the longest-lasting technology vendor in the market. The firm is a survivor largely because it is able to step outside of the day-to-day tactical concerns and invest in the world of tomorrow so it has a place in it. That’s a decent example that more firms in every segment should follow.

 

Without the cloud, Microsoft might lose grasp on the enterprise

Microsoft aims to get partners to help persuade enterprises to switch from Microsoft software to its cloud services

Computerworld – Microsoft’s renewed push into the cloud computing market, revealed this week, doesn’t change the fact that it faces unprecedented competition in a business that’s vital to its future.

“This will be the hardest job Microsoft has ever faced,” said Jeff Kagan, an independent industry analyst. “It’s all about the Microsoft cloud. They have to be successful. Unlike the past, where Microsoft really had little competition, in the cloud world they face lots of competition, so there are no guarantees.”

Microsoft Chief Operating Officer Kevin Turner, speaking at the Worldwide Partner Conference today, urged the company’s channel partners to start pushing the company’s cloud technologies – hard.

Microsoft made its fame and fortune by selling PC operating systems and on-premises business software – to the point that Windows and Office have long been standard business tools. To keep its dominant position, Microsoft needs to quickly make its way into cloud computing, which is becoming increasingly popular among small and large businesses looking to save money and IT resources.

If Microsoft doesn’t convince its partners to help out, businesses have plenty of other cloud options, including Google’s Apps suite.

Turner told the gathering that Microsoft’s cloud services – Office 365, Azure, Dynamics CRM Online and other tools – are growing robustly. However, he acknowledged that Microsoft and its partners need to pick up the pace.

Microsoft has relied heavily on its roughly 400,000 partners, which range from resellers to systems integrators, to sell its operating systems and applications for the past 20 years or so. Now the company needs to steer this huge ship in a new direction.

Robert Mahowald, an analyst with IDC, said the firm needs to persuade its partners that there’s money to be made by pushing Microsoft’s cloud offerings.

“Microsoft acknowledges that this is going to have a massive impact on its partners,” he told Computerworld. “They have to show that their partners can make money and succeed with this. Microsoft’s success is their partners’ success and vice versa.”

“Cloud is essentially the new platform for Microsoft,” said Mahowald. “I believe it’s more important than mobile, big data or social. What they had in Windows, they need to replicate [in the cloud] or they lose the franchise. If their old platform doesn’t matter anymore, then Microsoft has lost the software lock-in that is their crown jewel.”

“If customers look to the cloud and think they don’t need Microsoft, then [the company] loses its grip,” added Mahowald. “Microsoft has a whole lot riding on making this transition easy for their customers.”

Jagdish Rebello, an analyst with IHS, said that Microsoft will increasingly make cloud services and the Internet of Things the cornerstones of its long-term business strategy.

“I think Microsoft is a strong number three behind Amazon and Google in the cloud market,” Rebello said. “Amazon is clearly the leading player and has the dominant market share, but in some markets – particularly the large enterprise market – Microsoft is trying to leverage its strong relationships with IT and its considerable cloud offerings. Anecdotal evidence shows that they are doing well with some key customers.”

If this works for Microsoft, it could also be a boon for businesses that have a long history with the company.

“As partners plan to take enterprises using legacy Microsoft solutions to the cloud, it creates a degree of confidence in enterprise IT minds that there is a long-term solution for them,” said Rebello. “This makes Microsoft’s solutions appealing, or at least part of the many options that IT will consider.”

But Microsoft cannot succeed without help from its partners, Rebello said. “Without a strong ecosystem of partners, Microsoft will not be a major player in this space.”

 


HPQ vs. IBM:

TECH INVESTING

HPQ vs. IBM: Who Will Win This Clash of the Tech Titans?

Michael A. Robinson

Today I’m refereeing a boxing match between two of the biggest tech legends around: International Business Machines Corp. (NYSE: IBM) and Hewlett Packard Co. (NYSE: HPQ).

I’m calling it the Clash of the Tech Titans.

The prize? A big slug of profits for your investment portfolio…

Both of these fighters are big – we’re talking market caps in the tens of billions – but these longtime blue chips are more black and blue right now… and are working through major corporate turnarounds. (In fact, both installed new chief executive officers less than three years ago.) They’re both trying to raise revenue and income in order to send their stock prices higher.

Now, neither of these heavyweights is a bad investment – both are solid companies, in it for the long haul.

But one of these pugs just might be a stud – more of an inside fighter – that you should add to your portfolio now.

Today, if you agree with my decision and make that investment, you’ll soon be watching your wealth grow fast…

The Tale of the Tape

In this corner, “Big Blue” traces its New York roots back more than 100 years. It literally pioneered the dawn of the computer age.

International Business Machines Corp. (NYSE: IBM) weighs in with a market cap of $186.13 billion, 2013 revenue of $99.8 billion, and net income of $18 billion. It’s got a price/earnings (P/E) ratio of 12.64 and a 2.38% dividend yield.

And in the other corner, the “Puncher from Palo Alto” was one of the very first firms to set up shop in what became known as Silicon Valley. Founded in 1939 by two graduates of Stanford University, the founding partners started up the company in the proverbial one-car garage.

Weighing in with a market cap of $63.63 billion, Hewlett Packard Co. (NYSE: HPQ) reported $112.25 billion in 2013 revenue and a net income of $5.11 billion. Its P/E ratio is at 11.97, and its dividend yield stands at 1.88%.

However, appearances can be deceiving. Those sound like great numbers, but both of these fighters are in turnarounds.

Both have seen declining revenue, income, and stock price over the past few years, and I know that neither of these new CEOs is happy with her stock price.

And that’s why I’m refereeing this match. I’ve been a turnaround investor for almost as long as I’ve been around the high-tech world.

No doubt, turnarounds are one of my “special situations” that can hand investors huge profits… if you know what to look for.

Let’s ring the bell.

IBM dreams Watson

IBM dreams of a federal-friendly Watson


Steve Gold, vice president of IBM’s Watson Group, showcases mobile uses of Watson. (Credit: IBM)

For decades, IBM has worked on developing a cognitive computing system named Watson to give humans the answers they desire in a quick and contextual setting. And though it’s most famous for an appearance on Jeopardy competing against some of the show’s greatest contestants, the cognitive computer has an abundance of real world applications the company said could one day serve the federal government.

Whereas normal computers are programmed to process code, Watson is a machine that “can actually be taught and learn,” as well as perceive, reason and relate information in human language, said Neal Byrd, a member of the IBM Watson Solutions team formed earlier this year, in a webcast Tuesday.

To do this, IBM has been working on an iterative process since the ’90s, expanding the knowledge base of the computer by training it just like a newborn baby. But whereas babies can use five different senses to take in information, Byrd said, Watson is a ferocious data reader. Years back, that constant processing and storing of information meant a massive system, filling up an entire room. Now, it’s the size of three stacked pizza boxes and delivered via the cloud, according to a Watson press representative.

 

IBM’s major public breakthrough came in 1997 when its Deep Blue computer, one of Watson’s ancestors, played and defeated chess champion Garry Kasparov. Rather than going through programmed, situational code, Byrd said IBM taught the computer “how to compete like a human in that it had to sacrifice moves to win.”

But outside of competitive trivia and board games, Watson’s value to society will likely materialize in its ability to produce sought-after information from several locales in the time it takes a human to process a thought. And for federal agencies, that means improved services both internally and externally.

During the Watson webcast, Byrd explained how the State Department could theoretically leverage Watson for citizen service self-help. For instance, a college student preparing to study abroad would probably have tons of questions before the trip. Instead of involving a department employee to answer all the questions or subjecting the student to searching through FAQs and Google for keywords, a trained Watson could source together answers to those questions in one location.

Elsewhere in the federal government, Byrd said Watson could impact health care for underrepresented groups.

“We think Watson is going to be ready to take the medical board exam within the next quarter or so, and we think it’s going to pass,” he said. “Now we’re going to have a machine that’s trained in the latest medical board information out there.” So, physicians in remote areas, away from hospitals, can access the “latest and greatest health information” from a computer or mobile device. Additionally, it can help minorities speaking foreign languages, starting soon with Spanish and then French.

At the Justice Department, Byrd foresees litigators using Watson to “look at precedents and laws on the book, as well as modeling — whether someone should even take a case and their chances of winning based on evidence and historical models.” And before a case gets to the courts, law enforcement can use an investigative framework to “find people who don’t want to be found.”

And dear to many in the federal IT community, Watson might be able to solve issues with federal procurement, which Byrd said is “on the top of mind for everybody with shrinking budgets.” By providing federal agencies a quick pathway to a wealth of information on the cheapest and most efficient seller, he said, they could avoid buying redundant or outdated technology and other equipment.


 

Content Marketing and Semantic Web Optimization


5 Ways to Align Content Marketing and Semantic Web Optimization

 

Integrate content marketing and semantic search optimization practices in order to create a natural single-track strategy.

2014 has been heralded as the year of content marketing. At the same time, we’re optimizing our search marketing practices for the semantic search environment. Together, there’s a need to merge the two different objectives into a unified strategy.

From a search marketing perspective, it makes sense to integrate content marketing and semantic search optimization practices. The introduction of Hummingbird has taught us to deploy search optimization strategies that contextualize queries. Digital marketing with content, on the other hand, is deployed to drive traffic and engage prospects. You can see where the two might combine to form a natural single-track strategy, right?

I’ve got five ways I believe content marketing and semantic search can be better aligned.

1. Google Authorship

It’s been talked about here on ClickZ, on the Adobe blog, on Moz, and on many search industry blogs. Google has stated that Author Rank is used as a ranking factor, especially for in-depth articles. Moreover, you can’t rank without content that is viewed, shared, mentioned, linked to, or otherwise distributed.

Google Authorship has been shown to significantly boost organic visibility, even for authors who exist in fewer than 100 Google+ circles. Authorship markup helps with ranking on Google but more importantly, leads to higher click-throughs. This will, in turn, support better organic visibility. By distributing contextualized content that can be indexed using snippets through a Google+ profile, your chances for high visibility are increased.
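For reference, here is a minimal sketch of what that authorship markup looked like; the Google+ profile URL and author name below are hypothetical placeholders, and verification also required the profile to link back to the site in its “Contributor to” section:

<!-- In the page head: tie the content to the author's Google+ profile
     (placeholder profile ID, not a real account). -->
<link rel="author" href="https://plus.google.com/100000000000000000000"/>
<!-- Or as a visible byline link in the body: -->
<span>By <a rel="author" href="https://plus.google.com/100000000000000000000">Jane Doe</a></span>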

2. Sharable Content

Search marketing analysts have been implying for years that social signals will soon correlate to better search visibility. While Google has denied using social signals as ranking factors, there is still a correlation effect where content with higher social activity may also rank higher. Obviously, shared content is not about link building, but rather a strategy – using content that provides value – which seeks to have your content shared among social circles, authority sites, and others. As your subject matter authority (see #4 below) improves, search relevancy will likely improve as well.

3. Link Potential

I realize that, in many circles, link building is not an intentional technique most enterprise SEOs engage in regularly. Partly this is due to the spammy ways in which SEOs have acquired links, which are repugnant to search marketers who optimize in a natural way. Traditional link building also takes more action, effort, and commitment than a social sharing strategy. However, because links remain a high-impact ranking factor, it’s important to consider deploying semantic-centric marketing to build inbound links. In turn, this value will contribute to increasing your subject matter authority (see #4). Consequently, a robust content marketing campaign that is optimized for semantic search is more likely to yield page authority, thus greater search visibility, than simply deploying a social sharing campaign alone.

4. Subject Matter Authority

One intent of search engines is to provide searchers with the most likely sources of information for a specific query. In order to ascend to a position as a subject matter authority, your pages must reflect a combination of relevant information and popularity. The value you provide, when optimized for semantic search, can evoke higher search visibility because your site is recognized by engines as having domain authority, which is another way of saying you have become a subject matter expert, a “go-to” site for potential customers.

5. Structured Authoring

Many trends are pointing to structured authoring (SA) governing the future of mobile and local search. Every content marketing campaign should be developed with search results in mind IMHO. Therefore, content marketing that deploys structured authoring will be better positioned to have success in the space.

As background, I talked about structured authoring as the new normal in search optimization recently. Note, though, that SA has been noted by Bing and others to not affect ranking. However, because this siloing of data makes it easier for engines to index page data, structured authoring of information typically leads to greater visibility. The additional benefits include a consistent delivery of information and reuse of data and localization, both critical factors in the mobile search space.

Using XML to define attributes (called “resources”), we can assign a variety of subject, property, and object tags which, when used across multiple assets, produce a unified profile that searchers can approach through a diverse set of queries.

The content being distributed has to be specified (authored) with context in mind. Take the following example from schema.org that marks up a page for the movie Avatar:

<div itemscope itemtype="http://schema.org/Movie">
<h1 itemprop="name">Avatar</h1>
<span>Director: <span itemprop="director">James Cameron</span> (born August 16, 1954)</span>
<span itemprop="genre">Science fiction</span>
<a href="../movies/avatar-theatrical-trailer.html" itemprop="trailer">Trailer</a>
</div>

Supporting content related to this entry should provide additional context to the spiders. Here is an example of how a contextualized, related asset might be structured:

<div itemscope itemtype="http://schema.org/Movie">
<h1 itemprop="name">Avatar</h1>
<span>Actor: <span itemprop="actor">Sigourney Weaver</span> (born October 8, 1949)</span>
<span itemprop="genre">Science fiction</span>
<span>Released: <span itemprop="datePublished">December 18, 2009</span></span>
</div>

In this markup, we defined the actor Sigourney Weaver as an asset associated with the movie. You can see the variety of information that can be written into each snippet. There will usually be a robust set of resources that you can embed in each page. The more details you can provide in a structured author markup, the more likely your assets will be found in a semantic Web environment.

Rich Snippets

Content marketing for semantic web search success starts with creating rich snippets. Rich snippets are parcels of information displayed in various formats on search engine results pages. On-page markup embeds schemas, which provide standardized rules that engines use when crawling your pages. Look at the image below:

[Image: Google search results for “miami beach” showing review, local map, image, and traditional text snippets]

Notice there are four types of snippets in the image: reviews with image, local map, images, and traditional text snippets. Your content delivery approach should be structured in a way that leverages as many attributes within your content as possible, meaning it’s best to structure your microdata, microformats, or RDFa to deliver one or more of the following resources that Google supports in every marketing asset you distribute:

  • Reviews
  • People
  • Products
  • Businesses and organizations
  • Recipes
  • Events
  • Music

Doing so ensures that, when indexing your pages, search engines can easily depict the details that searchers are looking for. To see how your structured data will appear in a SERP, use the Google Structured Data Testing Tool.
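To make this concrete, here is a minimal sketch of microdata for the Reviews resource type above, in the same style as the Avatar examples; the reviewer, rating, and review text are hypothetical placeholders:

<div itemscope itemtype="http://schema.org/Review">
  <!-- The thing being reviewed is itself a typed item. -->
  <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Movie">
    <span itemprop="name">Avatar</span>
  </span>
  <span>Reviewed by: <span itemprop="author">Jane Doe</span></span>
  <!-- A nested Rating item supplies the star values engines display. -->
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span>
  </div>
  <p itemprop="reviewBody">A visually stunning film that rewards repeat viewings.</p>
</div>

Run markup like this through the testing tool before publishing; a single mistyped itemprop can keep the rich snippet from displaying.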

Disambiguation

Let me provide a word of caution here. When you optimize your content marketing for semantic search, avoid conflicts around ambiguity. Disambiguation is required when spiders get confused about the use of certain language, and that can lead to ineffective visibility (i.e. high bounce rates due to irrelevancy). For example, the term board can mean an organized body, a piece of lumber, a set of connected circuitry, or the act of getting on a plane. Protecting yourself from ambiguity requires that you define on-page resources clearly and completely.
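As a sketch of how explicit typing can help resolve that ambiguity (the product name and description below are hypothetical), an itemtype pins down which sense of “board” a page means:

<!-- The itemtype tells spiders this "board" is a lumber product, not a
     committee, a circuit board, or the act of boarding a plane. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Cedar fence board</span>
  <span itemprop="description">1 in. x 6 in. x 6 ft. rough-sawn cedar board for outdoor fencing.</span>
</div>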

Content marketing for semantic search isn’t terribly complicated – but it takes an organized approach to creating content using on-page markup, which connects machine-readable attributes that meet a searcher’s intent. We must find ways to describe our business without repeatedly embedding the terms most directly used to search for it. With the dawn of semantic search following the Hummingbird update, content marketing has become more necessary in order to put what you have to say in as many ways as you can.

Content marketing gets much more interesting and valuable, to your prospects and search engines alike, as you continue to innovate in how that content’s structured and delivered. Implementing just a few of these strategies on your most valuable content can significantly improve the effectiveness of your overall digital marketing efforts.


 
