IBM’s McAfee-as-a-service cloudy antivirus wobbled for nearly a day

Updated IBM’s cloud experienced an “unplanned event” that caused its McAfee-as-a-service offering to operate with sub-par performance for nearly a day.

“At approximately 0347 AM UTC on June 20, engineers with Compute Infrastructure identified a database issue that necessitated the restoration of a key update repository for McAfee Antivirus services from backup,” read an advisory sent to customers today.

The advisory explained that some 27 IBM data centers across Europe, Asia, the USA, and Mexico experienced disrupted anti-malware scanning.

A later advisory dated “Thursday 21-Jun-2018 01:26 UTC” noted “all Mcafee services have been returned to operational status, we apologize for the inconvenience.”

IBM touts McAfee-as-a-service as part of a portfolio of offerings to help its customers secure and defend their rented servers in the Big Blue cloud.

But that help wasn’t very helpful because the first advisory said “Customers in the listed regions may experience difficulty updating existing McAfee services, provisioning new McAfee services, or performing scheduled scans or maintenance on McAfee services.”

Yup, you read that right: if you relied on IBM to host the McAfee antivirus that protects your cloudy stuff, it was probably not up to the job of scanning for viruses and/or might not have been able to slurp new virus-squishing updates. Users would also have been unable to do the “hey, the cloud lets me run up new servers whenever I want to” thing, because new services weren’t starting reliably.

Needless to say this is not an optimal mode of operation for antivirus software, and just isn’t the sort of thing that a cloud or SaaS operator is supposed to do.

It gets worse: IBM said the incident was spotted at approximately “0347AM UTC on 6-20-2018”. The final notice came nearly 22 hours later.

Read More Here

Article Credit: The Register

Go to Source

Test Systems Better, IBM tells UK IT meltdown bank TSB

Updated A report into the IT meltdown at TSB has suggested the British bank did not carry out rigorous enough testing and that the problems went beyond previously reported middleware issues.

The chaos at the bank, a subsidiary of the Spanish Sabadell Group, saw many customers unable to access services for a week at the end of April after the bank bodged a long-planned migration off its former parent firm Lloyds Banking Group’s systems. The IT issues were compounded by a wave of scams and an underwhelming response from execs.

Amid the crisis, the bank hired IBM in a systems integration role to identify and resolve the problems – which the bank’s CEO Paul Pester told MPs was due to issues with its middleware systems.

Big Blue produced a short presentation for the bank four days after being appointed, offering a “preliminary work plan with very early hypotheses” – and the Treasury Committee has today published the slide deck.

In it, IBM suggested that the bank’s testing was not up to scratch, saying it “has not seen evidence of the application of a rigorous set of go-live criteria to prove production readiness”.

Emphasising the scale and complexity of the project, IBM said that a firm would need “longer than normal to prove the platform through incremental customer take-on to observe and mitigate any operational risks” – and warned that such projects bring a broad range of hard-to-diagnose technical and functional problems.

“To address this risk profile, IBM would expect world class design rigour, test discipline, comprehensive operational proving, cut-over trial runs and operational support set-up,” it said.

However, a set of bullet points suggest that this was not the case – or that TSB was not able to demonstrate this to IBM.

“Performance testing did not provide the required evidence of capacity and the lack of active-active test environments have materialised risk due to issues with global load balancing (GLB) across data centres,” IBM stated.

It said that a “limited number of services” – including mortgage origination, ATM and head office functions – had been launched on the new platform, along with a broader set of services for about 2,000 TSB partners.

The integrator added that it “has not seen evidence of technical information available to TSB”, such as architectures, configuration and design documents, test outcomes or monitoring information.

Read More Here

Article Credit: The Register

Go to Source

IBM Boosts Support of Cloud-Native Capabilities

IBM (NYSE: IBM) has announced that for the first time, its industry-leading IBM Cloud Private application platform will be made available to run on the company’s enterprise Cloud Managed Services (CMS).

Hundreds of world-leading banks, airlines and government organizations already rely on IBM Cloud Managed Services for managing enterprise applications on security-rich clouds. IBM is now enabling its CMS customers to extend their capabilities and maximise existing investments by syncing with IBM Cloud Private – a new solution based on Kubernetes containers designed to rapidly build, modernize and deploy applications in client-managed environments.

By bringing the two together, clients taking advantage of the security-rich, production-ready cloud environment that CMS offers can now reap the benefits of the latest cloud-native DevOps capabilities of IBM Cloud Private, accelerating time to market for innovative, cloud-native applications.

“We see that organizations are aggressively transforming to respond faster to changing market, customer and competitive demands,” said Mark Slaga, GM for IBM Cloud Managed Services. “To support them in their journeys, IBM is continuously innovating and adapting its cloud offerings. By combining the power of IBM Cloud Managed Services and IBM Cloud Private, customers will be able to benefit from a highly-secure and versatile environment for managing and modernizing apps and turning decades of data into competitive advantage.”

Launched late last year, IBM Cloud Private is front and center of IBM’s hybrid cloud strategy. Thanks to the flexibility of container technology, it can be installed on a wide range of enterprise systems to create a private cloud with an architecture and capabilities consistent with the public IBM Cloud. IBM Cloud Private’s combination of innovation, flexibility and control is particularly important to companies in regulated industries such as finance with high standards for security and reliability.

Coupled with CMS’ built-in security, disaster recovery, and automated infrastructure and application management, IBM’s clients can now have a powerful combination that helps provide the security, flexibility and speed they need to compete in the most dynamic markets.

The solution uses the power of Kubernetes container technology to add self-service Platform-as-a-Service capabilities to IBM’s managed CMS cloud infrastructure. In addition, IBM’s CMS clients can benefit from faster access to IBM’s analytics, data, middleware and Watson portfolios available through the IBM Cloud Private platform.
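In Kubernetes terms, the self-service model described above boils down to declarative manifests that the platform schedules onto the cluster. The sketch below is an entirely hypothetical, minimal example of such a manifest; the names, labels and container image are placeholders for illustration, not IBM Cloud Private specifics:

```yaml
# Hypothetical minimal Kubernetes Deployment of the kind a
# self-service platform applies on a user's behalf.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-app            # placeholder name
spec:
  replicas: 3                 # the platform scales out by raising this count
  selector:
    matchLabels:
      app: sample-app
  template:
    metadata:
      labels:
        app: sample-app
    spec:
      containers:
        - name: sample-app
          image: registry.example.com/sample-app:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Because the desired state is declared rather than scripted, the same descriptor works unchanged whether the underlying cluster sits in a private data centre or a managed cloud, which is the consistency the hybrid pitch relies on.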

Read More Here

Article Credit: Cision

Go to Source

How To Successfully Present ERP To The Board

Stating to the board “we’re doing ERP” risks starting with the nebulous and descending into the ineffectual.

Traditionally, ERP was about cost reduction, operating efficiency, standardisation and centralisation, without focusing on opportunities to grow. Today, however, the board’s focus is on digital business rather than cost savings.

Unless ERP strategies change to embrace and enable digital business, they – and the CIOs that propose them – will be relegated to a back office, low relevance activity.


Presenting to the board can be a daunting task. Many a talented CIO has been tripped up by questions and statements from board members that appear to come from left field. When under pressure, it’s normal to resort to what’s comfortable – vendors, technologies and buzzwords.

The board wants to hear how an ERP strategy will deliver measurable business impact and provide a renovated core for digital business. This is where a postmodern ERP strategy comes in, addressing both existing investments and new ERP technologies, to deliver measurable business benefits. Presenting a back office, low relevance strategy that bemoans a historic lack of investment in IT portends a poor outcome.

The board cares primarily about only one thing: shareholder value. You need to communicate an understanding of the business context and how the company will be competitive. For public sector organisations, the focus should be mission enhancement rather than competitive advantage.

There’s often a strange paradox in that boards expect CIOs to be business focused, strategic and succinct. Yet when presented with an ERP strategy in such terms, board members may shift the focus to vendors and technologies, since that’s what they’re accustomed to discussing with CIOs.

Garner support and guidance

Executing an ERP strategy is an ongoing stream of decisions and changes to business processes. These can’t be resolved properly without strong executive management and board support.

Lack of executive leadership is one of the primary reasons ERP programs fail. Executives will look to the board for support and guidance. Their support of the agreed strategy reinforces its importance throughout the organisation.

It’s critical that you know who the digital influencers are on the board and throughout the organisation. A successful presentation to the board isn’t just about completing a check-the-box exercise and gaining funding. It’s about getting support throughout the program.

Help the board understand that ERP should be treated as an ongoing strategy, not as an individual one-time project. The strategy will contain multiple programmes and projects delivering measurable business value over time.

Don’t focus on ‘doing ERP’

Stating “we’re doing ERP” is problematic for several reasons. By itself, the term “ERP” no longer provides a level of understanding and/or clarity to an organisation. Over the years, the term has been used ubiquitously to mean any type of back-end business system in any industry.

In addition, ERP isn’t a single thing, application or vendor. Value doesn’t derive from buying software and simply implementing it. The value comes from the adoption of new and/or improved ways of working. It also comes from the development of a strong core system of record that can be exposed to systems of differentiation and innovation.

Gartner believes that CIOs who take a business strategy first approach to ERP will deliver 60 per cent increased business value over those who take a vendor first approach. Success and the realisation of business value require strategy, organisational change leadership and effective governance. Technology by itself isn’t enough.

Board members may have past experience with a variety of projects and programmes that were categorised as ERP. These experiences, good and bad, may colour their view of this current initiative.

It’s essential that your presentation clearly identifies what information you want the board to understand and what outcome you need from it. Do you require the board’s support? Do you require funding? Executive sponsorship? Or is this presentation just to provide background information for future discussions?

Board meetings can be quite contentious affairs. Read the room quickly and adjust your pitch accordingly. Many CIOs don’t sit on the board and this ERP presentation may be only one item on a very busy agenda.

Present a vision of the future

Clearly state what’s happening in business terms that requires action at this point in time. Traditional ERP presentations often focus on risk, rather than reward.

One overused approach in monolithic ERP initiatives employed the concept of a “burning platform”: a sense of urgency around a dire situation, with the goal of scaring the board into action by linking the ERP initiative to a veiled threat: “If we don’t do this, bad things will happen!”

Boards don’t scare easily. They’re often composed of seasoned business professionals who’ve “been there and done that”. They’ve heard many CIOs make prophecies of doom in the past that didn’t turn out to be true.

Present to the board a vision and narrative of the future that provides a positive view of the reasons behind the strategy, and an understanding of the objectives and benefits.

Go to Source

Three Powerful Ways Big Data Can Grow Your Leadership Career

During my short stint in corporate America, I held several cross-functional roles that drew on my expertise in both business analytics and quality assurance. I usually worked alongside or in direct support of one or more project managers on various agile development projects. My primary responsibilities included acting as a liaison between senior leadership and the IT team to elicit, gather, define, document and test business and functional requirements. These experiences provided me with a firsthand account of how important utilizing big data can be to leadership career growth and development.

Big data is used to identify patterns and trends that can yield powerful insights into human interactions, especially consumer behavior. This data can include demographic, geographic and psychographic attributes collected from various sources throughout the consumer life cycle as well as from other areas of each individual’s life.

The idea of big data can seem scary. Data is being collected from everywhere all the time, and leaders are expected to know how and when to use it. This newfound responsibility also means adapting to new technologies and to ever-evolving policies and regulations around security and compliance.

Despite these concerns, the most successful leaders are learning to embrace big data as a catalyst to up-level their careers in three powerful ways.

1. Become a better decision maker.

Historically, intuition has been a highly favored attribute among leaders. Unfortunately, instinct can only take one so far. Access to big data analytics allows leaders to make fact-based decisions rather than those driven by emotion and belief. It is much better to know something to be true rather than to simply believe it to be true with no other basis for such belief than past experiences. Fact-based decisions have fewer risks, and leaders have an easier time isolating root causes of specific problems.

An executive coaching client, in the role of fund development manager for a local nonprofit organization, used big data analytics to identify trends in giving among high-net-worth donors. Armed with the compiled information, she was able to launch a new campaign that focused on the specific needs of this elite target market and increase their annual giving by 30% (well over a seven-figure gain) in the following year. Imagine adding this impressive accomplishment to your curriculum vitae.
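The segment analysis behind a result like that is straightforward in code. The sketch below uses entirely synthetic donation records and an assumed high-net-worth threshold (the records, threshold and figures are all illustrative, not the client’s actual data):

```python
# Synthetic donor records: (donor_id, year, amount). Figures are invented.
donations = [
    ("d1", 2016, 120_000), ("d1", 2017, 165_000),
    ("d2", 2016, 300_000), ("d2", 2017, 390_000),
    ("d3", 2016,   2_500), ("d3", 2017,   2_400),
    ("d4", 2016,   1_000), ("d4", 2017,   1_150),
]

HIGH_NET_WORTH = 100_000   # assumed threshold for the "elite" segment


def segment_total(records, year):
    """Sum one year's giving from donations at or above the threshold."""
    return sum(amount for _, y, amount in records
               if y == year and amount >= HIGH_NET_WORTH)


before = segment_total(donations, 2016)
after = segment_total(donations, 2017)
growth = (after - before) / before
print(f"high-net-worth giving growth: {growth:.0%}")   # prints "high-net-worth giving growth: 32%"
```

At production scale the records would come from a CRM export rather than a literal list, but the shape of the analysis, segment then compare year over year, is the same.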

Read More Here

Article Credit: Forbes

Go to Source

Big data may only offer ‘fuzzy snapshot’ of health

When it comes to understanding what makes people tick—and get sick—medical science has long assumed that the bigger the sample of human subjects, the better. New research suggests this big-data approach may be wildly off the mark.

That’s largely because emotions, behavior, and physiology vary markedly from one person to the next and one moment to the next. So averaging out data from a large group of human subjects at a given instant offers only a snapshot, and a fuzzy one at that, researchers say.

The findings, published this week in the Proceedings of the National Academy of Sciences, have implications for everything from mining social media data to customizing health therapies, and could change the way researchers and clinicians analyze, diagnose, and treat mental and physical disorders.

“If you want to know what individuals feel or how they become sick, you have to conduct research on individuals, not on groups,” says study lead author Aaron Fisher, an assistant professor of psychology at the University of California, Berkeley. “Diseases, mental disorders, emotions, and behaviors are expressed within individual people, over time. A snapshot of many people at one moment in time can’t capture these phenomena.”

Moreover, the consequences of continuing to rely on group data in the medical, social, and behavioral sciences include misdiagnoses, prescribing the wrong treatments, and generally perpetuating scientific theory and experimentation that is not properly calibrated to the differences between individuals, Fisher says.

That said, a fix is within reach: “People shouldn’t necessarily lose faith in medical or social science,” he says. “Instead, they should see the potential to conduct scientific studies as a part of routine care. This is how we can truly personalize medicine.”

Plus, he notes, “modern technologies allow us to collect many observations per person relatively easily, and modern computing makes the analysis of these data possible in ways that were not possible in the past.”

Fisher and fellow researchers used statistical models to compare data collected on hundreds of people, including healthy individuals and those with disorders ranging from depression and anxiety to post-traumatic stress disorder and panic disorder.

In six separate studies they analyzed data via online and smartphone self-report surveys, as well as electrocardiogram tests to measure heart rates. The results consistently showed that what’s true for the group is not necessarily true for the individual.
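A small simulation makes the group-versus-individual gap concrete. The sketch below uses synthetic data with illustrative parameters only: each simulated person has a stable baseline plus day-to-day swings, built so that the two measures move in opposite directions within a person even though a one-moment snapshot across people shows a strong positive association:

```python
import random
import statistics

random.seed(0)


def pearson(xs, ys):
    """Plain Pearson correlation, no third-party libraries needed."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


n_people, n_days = 200, 50
snapshot_a, snapshot_b = [], []   # one observation per person, cross-sectional style
within_rs = []                    # one correlation per person, across their own days

for _ in range(n_people):
    baseline = random.gauss(0, 2)          # stable trait: varies between people
    series_a, series_b = [], []
    for _ in range(n_days):
        daily = random.gauss(0, 1)         # day-to-day state fluctuation
        series_a.append(baseline + daily)
        # Within a person: when measure A is up, measure B is down.
        series_b.append(baseline - daily + random.gauss(0, 0.3))
    snapshot_a.append(series_a[0])         # the "snapshot of many people at one moment"
    snapshot_b.append(series_b[0])
    within_rs.append(pearson(series_a, series_b))

group_r = pearson(snapshot_a, snapshot_b)
print(f"group-level r (one moment, many people): {group_r:+.2f}")
print(f"median within-person r (one person, many days): "
      f"{statistics.median(within_rs):+.2f}")
```

The group correlation comes out strongly positive while the typical within-person correlation is strongly negative, exactly the "true for the group, not for the individual" pattern the studies report.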

Read More Here

Article Credit: Futurity

Go to Source

Everything big data claims to know about you could be wrong

When it comes to understanding what makes people tick — and get sick — medical science has long assumed that the bigger the sample of human subjects, the better. But new research led by the University of California, Berkeley, suggests this big-data approach may be wildly off the mark.

That’s largely because emotions, behavior and physiology vary markedly from one person to the next and one moment to the next. So averaging out data collected from a large group of human subjects at a given instant offers only a snapshot, and a fuzzy one at that, researchers said.

The findings, published this week in the Proceedings of the National Academy of Sciences journal, have implications for everything from mining social media data to customizing health therapies, and could change the way researchers and clinicians analyze, diagnose and treat mental and physical disorders.

“If you want to know what individuals feel or how they become sick, you have to conduct research on individuals, not on groups,” said study lead author Aaron Fisher, an assistant professor of psychology at UC Berkeley. “Diseases, mental disorders, emotions, and behaviors are expressed within individual people, over time. A snapshot of many people at one moment in time can’t capture these phenomena.”

Moreover, the consequences of continuing to rely on group data in the medical, social and behavioral sciences include misdiagnoses, prescribing the wrong treatments and generally perpetuating scientific theory and experimentation that is not properly calibrated to the differences between individuals, Fisher said.

That said, a fix is within reach: “People shouldn’t necessarily lose faith in medical or social science,” he said. “Instead, they should see the potential to conduct scientific studies as a part of routine care. This is how we can truly personalize medicine.”

Plus, he noted, “modern technologies allow us to collect many observations per person relatively easily, and modern computing makes the analysis of these data possible in ways that were not possible in the past.”

Fisher and fellow researchers at Drexel University in Philadelphia and the University of Groningen in the Netherlands used statistical models to compare data collected on hundreds of people, including healthy individuals and those with disorders ranging from depression and anxiety to post-traumatic stress disorder and panic disorder.

In six separate studies they analyzed data via online and smartphone self-report surveys, as well as electrocardiogram tests to measure heart rates. The results consistently showed that what’s true for the group is not necessarily true for the individual.

For example, a group analysis of people with depression found that they worry a great deal. But when the same analysis was applied to each individual in that group, researchers discovered wide variations that ranged from zero worrying to agonizing well above the group average.

Moreover, in looking at the correlation between fear and avoidance — a common association in group research — they found that for many individuals, fear did not cause them to avoid certain activities, or vice versa.

Read More Here

Article Credit: SD

Go to Source

New SYSPRO Survey Shows Next-Gen ERP Technology Users “Want IT Their Way”

SYSPRO, a global provider of industry-built ERP software, has released a survey of technology users that indicates millennial and future generations of workers will have very specific demands on business software based on their preferred use of personal devices. Specifically, more than 76% of respondents said next-generation users will be drawn to ERP vendors that let them do business “their way.” The survey was released to 25 leading industry analysts as part of SYSPRO’s summer 2018 technology roadshow. See news release also issued today: “SYSPRO Once Again Leads the Way by Delivering Practical ERP Business Solutions for Leveraging IoT Devices and an AI-Infused Interface.”


In fact, when asked “What do you think would be the single biggest morale ‘buzz kill’ to a millennial or newer ERP software user,” respondents chose “inflexibility” as the most problematic by nearly 3-to-1, versus concerns over limited features/functionality or poor service. Said one respondent: “This is the now generation coming up and they won’t wait, they won’t do steps without understanding the purpose of the steps. If not important, they will eliminate as many steps as possible – not out of laziness, but out of speed and efficiency.” Seventy-one percent (71%) of the survey respondents believe speed and efficiency will be the top priority of millennials and newer users.


The SYSPRO survey, which the company calls SNAP (SYSPRO Needs Answers Please), also asked respondents to rank their top priorities in new technology areas that would “get the most interest or excitement from newer ERP users.” The three areas that ranked highest for the number one priority were: cloud deployment (32.3%), big data/predictive analytics (28.1%), and social-enabled and intuitive user interfaces (22.2%).

“Based on the extensive and growing use of personal devices, SYSPRO’s technology leaders began aggressively focusing on infusing personalization and flexibility components that we believed would be most useful and preferred by the changing landscape of future ERP users,” said Brian Stein, CEO, SYSPRO USA. “Companies that select SYSPRO have discovered that they are not locked into rigid software or hardware choices that are common to many other ERP providers. Our software is very much future-proof and geared to a level of personalization that new-age users are demanding.”

Go to Source