
Tuesday 29 March 2016

Happiest Minds’ ThreatVigil Wins GOLD at the Infosecurity Product Guide’s Global Excellence Award 2016

Happiest Minds Technologies, a next generation digital transformation, infrastructure, security and product engineering services company, today, announced that ThreatVigil, its cloud-based threat management solution, won top honours in the Infosecurity Product Guide’s Global Excellence Awards. The solution was recognized in the ‘GOLD’ category for Vulnerability Assessment, Remediation and Management.

ThreatVigil is an on-demand, cloud-based threat management solution comprising vulnerability assessment and penetration testing of all system components, including business applications, databases, secure network perimeters, systems and network infrastructure, mobility solutions and virtualized cloud environments. Built on a combination of industry-proven automated tools and in-depth manual assessment techniques, it is highly scalable and offers faster, simpler deployment with no dependencies.

"We are honoured to be recognized as an industry leader yet again by the Info Security Products Guide Global Excellence Awards. We are seeing increasing global demand for our cloud-based security solutions and these recognitions from industry forums reinforce the impact we are making with our IP based solution accelerators. This award reinforces our commitment to provide an innovative and pragmatic approach for enterprises to help them protect themselves against the dynamic and emerging threat landscape,” said Prasenjit Saha, President, Infrastructure Management Services and Security Business, Happiest Minds Technologies.

The security industry celebrated its 12th Annual Global Excellence Awards in San Francisco by honouring excellence in every facet of the industry, including products, industry leaders and best companies. Many well-recognized companies participated across 45 categories, and more than 50 judges from a broad spectrum of industry voices around the world analyzed the nominations. The average of their scores determined the finalists and winners, who were announced during the awards dinner and presentation attended by the finalists, judges and industry peers.

About Happiest Minds Technologies:

Happiest Minds enables Digital Transformation for enterprises and technology providers by delivering seamless customer experience, business efficiency and actionable insights through an integrated set of disruptive technologies: big data analytics, internet of things, mobility, cloud, security, unified communications, etc. Happiest Minds offers domain centric solutions applying skills, IPs and functional expertise in IT Services, Product Engineering, Infrastructure Management and Security. These services have applicability across industry sectors such as retail, consumer packaged goods, e-commerce, banking, insurance, hi-tech, engineering R&D, manufacturing, automotive and travel/transportation/hospitality.

Headquartered in Bangalore, India, Happiest Minds has operations in the US, UK, Singapore, Australia and has secured US $52.5 million Series-A funding. Its investors are JPMorgan Private Equity Group, Intel Capital and Ashok Soota.

About Info Security Products Guide:

Info Security Products Guide plays a vital role in keeping end-users informed of the choices they can make when it comes to protecting their digital resources. It is written expressly for those who are adamant on staying informed of security threats and the preventive measure they can take. You will discover a wealth of information in this guide including tomorrow's technology today, best deployment scenarios, people and technologies shaping info security and market research reports that facilitate in making the most pertinent security decisions. The Info Security Products Guide Global Excellence Awards recognize and honour excellence in all areas of information security. To learn more, visit www.infosecurityproductsguide.com and stay secured.

Sunday 13 March 2016

Top 5 Reasons for investing in Customer Experience

Gone are the days when success in business was determined by a premium-quality product or service, value for money and good customer service. In this age of extreme competitiveness, driven by disruptive technologies and allied digital transformation services, the key to success of any business lies in the customer experience it delivers. With the widespread reach of social media and real-time interactions over the internet, customer expectations have broadened. Customer experience in this digital transformation era is the cumulative experience of multiple touchpoints, which results in a long-term relationship between the business and the customer. The question the business world now faces is how to create and deliver the most appealing customer experience.

Let us have a look at the Top 5 reasons for investing in Customer Experience
  1. Drive loyalty: Enhance brand loyalty through engaging programs and gamification
  2. Increase Revenue: Develop Omnichannel experience to create multiplier effect
  3. Improve Customer Service: Understand 360 degree customer view to contextualize interactions
  4. Reduce Customer Churn: Know how to keep your customer sticking to your brand
  5. Competitive Advantage: Enhance data capturing analyzing capabilities to gain a competitive advantage

The business value of a great customer experience is enormous, which prompts global businesses to put the strategy, funds and processes in place to build an effective customer experience practice. Those who make the necessary changes to strategically prioritize CX will gain an edge over their competitors.

Thursday 3 March 2016

Transforming From Traditional IAM to Business Driven IAG

Providing the right people with the right access at the right time is critical in any organizational environment, irrespective of its size. In this age of explosive growth in network communications, increasing collaboration and policies like BYOD, it is challenging for enterprises to determine who has access to what resources and what they are doing with that access. Comprehensive governance controls are essential to reduce the risks of unauthorized access and mishandling of sensitive data, which can take a toll on the organization's reputation. It is also critical to comply with governance regulations that mandate access controls.

Traditional IAM (Identity and Access Management) focuses on access management and provisioning/de-provisioning related compliance. Enterprises still struggle to meet compliance requirements because it is not an all-inclusive solution; it concentrates on automating the user life cycle. Traditional IAM implementations are IT driven rather than business driven, and a provisioning-driven approach rarely achieves the expected business value. Traditional IAM is also not involved in user access reviews or periodic user access certification. The classic example is a user who requests and is granted access to a critical application for a temporary period: afterwards there is zero visibility into that unwanted access and its usage. Governance-driven IAG gives you real-time visibility into access changes.

Historically, IAM systems have been used in IT organizations to manage the life cycle of user accounts across multiple systems. These systems connect to user directories to authenticate users and retrieve basic profile attributes such as name, title and department. With this information, IAM can tell who a user is, but it cannot tell you about the user's entitlements, which are key for an application because they decide what each user can do with the application and its data. The challenge with the provisioning-driven approach is this: if a user requests and receives access to a CRM application, and that access is controlled through a group or entitlement, traditional IAM will provision the user to the entitlement, but it provides no visibility into what the user can actually do in the CRM with that entitlement.

IAG (Identity and Access Governance) systems help business people determine what a user can do within an application. They collect information about user identities, entitlements and roles from all applications. In addition, IAG provides more visibility into each entitlement and presents it in a business context rather than a technical one. This helps business managers understand the entitlements that users request, which in turn improves application compliance.
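To illustrate the difference a business context makes, here is a minimal Python sketch (hypothetical group names and risk ratings, not any particular IAG product) of how an entitlement catalogue might be represented so that an access review reads in business terms rather than technical ones:

```python
# Sketch (hypothetical names): an entitlement carries a business description
# and risk rating, so an access review reads as "Approve refunds up to $500"
# rather than a cryptic group name like "CRM_GRP_114".
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entitlement:
    technical_name: str        # the raw directory group or application role
    application: str
    business_description: str  # what the access actually lets a user do
    risk_level: str            # "low" / "medium" / "high"

@dataclass
class UserAccess:
    user_id: str
    department: str
    entitlements: List[Entitlement] = field(default_factory=list)

def review_items(access):
    """Reviewer-friendly lines for a periodic access certification."""
    return [
        f"{access.user_id} ({access.department}): {e.business_description} "
        f"[{e.application}, risk={e.risk_level}]"
        for e in access.entitlements
    ]

crm_refunds = Entitlement("CRM_GRP_114", "CRM",
                          "Approve customer refunds up to $500", "high")
for line in review_items(UserAccess("jdoe", "Customer Service", [crm_refunds])):
    print(line)
```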

Governance-driven IAG concentrates on a risk-driven approach. It is also more focused on entitlement management, which provides a more granular level of visibility into user access. This approach enables periodic user access reviews and certification of user access. Governance-driven IAG focuses on fast integration of applications across multiple platforms and provides greater visibility of user access. This model ensures appropriate access for all users, automates the user access review process and simplifies provisioning and de-provisioning.


In today’s complex IT landscape, where solutions depend on multiple heterogeneous platforms and enterprise applications extend their presence into the mobile and cloud space, tighter regulatory controls are required to protect enterprise data from unauthorized access. Governance-driven identity and access governance allows organizations to review, audit and enforce policies for fine-grained access privileges across the IT environment. It also brings end-to-end visibility and control across all critical systems and applications – a breadth of coverage that is more efficient and reliable than traditional IAM solutions.

Wednesday 27 January 2016

How to Select the Right Set of Devices for Mobile Testing?

Mobile phones have been one of mankind's great revolutions. Big, static landline phones have slowly become obsolete, and mobile phones are now an indispensable part of our everyday life. In today's world almost every person owns at least one phone, and a few own more than one. The smartphone's entry into the market has made people so dependent on the gadget that without it they would feel lost. It is now evident to almost every industry that the easiest way to reach customers is via apps, and most mobile solutions are built on new-age disruptive technologies. Compared with other computing devices, the reach of mobile and smart phones is huge. Accessible to everyone from high-profile business magnates to a roadside tea seller, mobile phones have, to a large extent, bridged the so-called “digital divide”.

Challenge
From its traditional role as a mere communication tool, the mobile phone has become a multipurpose gadget used for both personal and professional purposes, which creates opportunities as well as challenges, like two sides of a coin. Technology shifts and the proliferation of devices and operating systems create challenges for hardware manufacturers and application developers in developing, rolling out and updating products. Mobile application testing across various devices and platforms has become even more challenging. With so many device makers in the market, it is almost impossible to ensure that proper mobile testing is done on every device, and to a certain extent it is not required either. Digital modernization has encouraged people of every age group to manage their important data and images in the cloud, rely on apps that work as reminders, use messengers to keep in touch, and much more. As these wants have become basic needs, one of the key areas of focus should be the customer experience. The user's geography, age group and the targeted customer segment are crucial inputs for delivering the best possible customer experience. Another point to remember is that in the competitive mobility arena, go-to-market time has shrunk considerably; if you delay, someone else will take your place. Hence quality has to be ensured in a short span of time. These facts will drive anyone to be selective about the devices chosen for validation.

Let us have a look at the various devices that need to be considered for any mobile application testing.

Solution
By considering a few parameters, we can narrow down the devices on which the app needs to be tested.

Parameters to be considered are –
1.    Type of devices
2.    Form Factors
3.    OS

Type of devices –
Nowadays, most apps are made available on almost all types of devices, and hence we need to ensure that the app's user interface and user experience testing parameters are met on all device types, including phones, tablets and phablets.

Form Factor –
In most cases, the size of the screen is confused with the resolution of the screen. These are two distinct parameters and both must be considered. Resolution is the number of pixels on the screen, irrespective of screen size; how the app looks and where objects are placed on the screen depend on this parameter. Two devices may both have 5-inch screens yet differ in resolution.
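The distinction is easy to see with a quick calculation of pixel density (pixels per inch) from resolution and diagonal size. The sketch below uses illustrative values only:

```python
# Sketch: two devices with the same 5-inch screen but different resolutions
# have very different pixel densities, which is why both parameters matter.
from math import sqrt

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
    return sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

print(round(pixels_per_inch(720, 1280, 5.0)))   # ~294 ppi
print(round(pixels_per_inch(1080, 1920, 5.0)))  # ~441 ppi
```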


OS (iOS, Android, Windows and BlackBerry) –
An app is always built as a combination of newly developed code and third-party features or services in the end product, and a developer may have tweaked those third-party features to meet the product's requirements. Hence we need to ensure that this combination works correctly on the operating systems, and the versions of those operating systems, that are in major use. There is no point in testing outdated OS versions, as users keep moving towards the newer ones, but thought must be given to the oldest version that needs to be supported.

Matrix of devices shortlisted for testing
Note –

1.    The information below is as per Gsmarena.com.
2.    Only iOS and Android are considered in the matrix below to explain the exercise.

Type | Device | Resolution | OS versions | Size (inch) | Test Type
Android Tab | Nexus 10 | 2560 x 1600 | 5.X.X | 10 | Func & UX
Android Tab | Nexus 7 | 800 x 1280 | 5.X.X | 7 | UX
Android Tab | Micromax Canvas Tab P470 | 600 x 1024 | 4.4.x | 7 | Func & UX
Android Phone | Sony Xperia Z5 | 2160 x 3840 | 5.1.x | 5.5 | Func & UX
Android Phone | Nexus 6P | 1440 x 2560 | 6.x | 5.7 | Func & UX
Android Phone | Samsung Galaxy S4 | 1080 x 1920 | 4.2.2 | 5 | Func & UX
Android Phone | Moto G | 720 x 1280 | 5.1.1 | 5 | UX
Apple Phone | iPhone 5s | 640 x 1136 | iOS 7 | 4 | Func & UX
Apple Phone | iPhone 5s | 640 x 1136 | iOS 9.x | 4 | Func
Apple Phone | iPhone 6 | 750 x 1334 | iOS 8.x | 4.7 | Func & UX
Apple Phone | iPhone 6s | 750 x 1334 | iOS 9.x | 4.7 | UX
Apple Phone | iPhone 6s Plus | 1080 x 1920 | iOS 9.x | 5.5 | UX
Apple Tab | iPad Air 2 | 1536 x 2048 | iOS 9.x | 9.7 | Func & UX
Apple Tab | iPad mini 2 | 1536 x 2048 | iOS 8.x | 7.9 | Func

Though the above is a handpicked list of devices, it looks exhaustive and difficult to test on all of them. The idea is to cover all form factors, operating systems and device types across different brands, which is why the list seems big.

There is no shortcut if we have to validate functionality across different OS versions and user interface and user experience factors across different form factors; the combination of devices and OS versions is therefore selected with these facts in mind. Across OS versions, functionality is validated to ensure that the newly developed code and third-party features work without functional flaws. Across form factors, UX parameters are validated to make sure that all objects on the screen fit properly as per the agreed mock-ups, with no overlapping or partially hidden objects.
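A simple way to sanity-check such a shortlist is to verify programmatically that it covers every OS version and form-factor bucket you have decided to support. The sketch below uses hypothetical buckets and a trimmed device list purely for illustration:

```python
# Sketch: sanity-check that a shortlisted device matrix covers every OS
# version and form-factor bucket we decided to support. Buckets, versions
# and the trimmed device list below are hypothetical.
devices = [
    {"name": "Nexus 10",   "os": "Android 5", "size_in": 10.0},
    {"name": "Nexus 6P",   "os": "Android 6", "size_in": 5.7},
    {"name": "Galaxy S4",  "os": "Android 4", "size_in": 5.0},
    {"name": "iPhone 5s",  "os": "iOS 9",     "size_in": 4.0},
    {"name": "iPad Air 2", "os": "iOS 9",     "size_in": 9.7},
]

required_os = {"Android 4", "Android 5", "Android 6", "iOS 8", "iOS 9"}
required_form_factors = {"phone", "tablet"}

def form_factor(size_in):
    return "tablet" if size_in >= 7 else "phone"

covered_os = {d["os"] for d in devices}
covered_ff = {form_factor(d["size_in"]) for d in devices}

print("Missing OS versions:", required_os - covered_os)             # {'iOS 8'}
print("Missing form factors:", required_form_factors - covered_ff)  # set()
```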

While doing functional validation on different devices, you will naturally spot UI and UX glitches. So when you are testing only user-experience scenarios, you will know what has already been covered during functional testing and can focus on the remaining areas.
One should always keep an eye on the market for new devices, OS versions and browsers, and check whether they fit into the table above. With this exercise, it becomes easy to arrive at the devices to be considered for testing.


Despite short development cycles, go-to-market pressures and increasing competition in the mobility arena, mobile application testing across multiple devices and platforms is essential, and it is daunting too. Effective and timely mobile testing enables device makers and application developers to collect the metrics that improve overall product quality and deliver outstanding customer experiences.

Thursday 21 January 2016

Staying Afloat During a Cyber-Attack

Given the rising frequency of increasingly malicious and innovative cyber-attacks, one can safely conclude that cyber risk is here to stay. It is no longer a question of ‘if’ but ‘when’ your organization will have to deal with a cyber-attack. The cost of a cyber security breach is significant—in terms of money, business disruption and reputation. Depending on the magnitude of the attack, a cyber incident can potentially put you out of business.

The best course of action for a business that is attacked is a swift and effective response. A cyber security strategy with efficient incident response (IR) capabilities coupled with customer engagement initiatives helps limit the damage and ensures that the business is up and running as soon as possible. Reaching out and engaging with customers reassures them, and helps a business that’s dealing with a cyber-attack to regain customer confidence, and prevent defection.

An effective IR strategy navigates the following phases:

Identify
Information on events is collected from various sources such as intrusion detection systems and firewalls, and evaluated to identify deviations from the normal. Such deviations are then analyzed to check if they are sufficiently significant to be termed an event. The use of automation tools ensures swift detection and eliminates delays in moving to the containment phase. Once a deviation is identified as a security incident, the IR team is immediately notified to allow them to determine its scope, gather and document evidence, and estimate impact on operations. Businesses can bolster this process by incorporating an effective security information and event management (SIEM) system into their cyber security strategy.
Contain
Once a security event is detected and confirmed, it is essential to restrict damage by preventing its spread to other computer systems. Preventing the spread of malware involves isolating the affected systems, and rerouting the traffic to alternative servers. This helps limit the spread of the malware to other systems across the organization.

Eliminate
This step focuses on the removal of the malware from the affected systems. IR teams then conduct an analysis to find out the cause of the attack, perform detailed vulnerability assessment, and initiate action to address the vulnerabilities discovered to avert a repeat attack. A thorough scan of affected systems to eradicate latent malware is key to preventing a recurrence.

Restore
In the restoration stage, affected systems are brought back into action. While bringing the affected systems back into the production environment, adequate care should be taken to ensure that another incident does not occur. Once these systems are up and running, they are monitored to identify any deviations. The main objective is to ensure that the deficiency or the vulnerability that resulted in the incident that was just resolved does not cause a repeat incident.

Investigate
This is the last step and entails a thorough investigation of the attack to learn from the incident, and initiate remedial measures to prevent the recurrence of a similar attack. IR teams also undertake an analysis of the response to identify areas for improvement.

What enterprises need now are effective cyber security solutions to monitor and provide real-time visibility on a myriad of business applications, systems, networks and databases. There has been an increasing realization that basic protection tools for important corporate information are no longer sufficient to protect against new advanced threats. Furthermore, enterprises are under tremendous pressure to collect, review and store logs in a manner that complies with government and industry regulations.


Countering focused and targeted attacks requires a focused cyber security strategy. Organizations need to take a proactive approach to ensure that they stay secure in cyber space and adopt a robust cyber security strategy.

Monday 7 December 2015

Disruptive Technology Weekly Roundup – Dec 1st to Dec 7th

The prevention, detection and response to cyber security threats in 2016 will see a sea change, says a new report from Forrester Research. According to Forrester, the five cybersecurity predictions, and the resulting actions to be taken in 2016, are as follows. First, in this era of disruptive technologies, where wearables and IoT are expected to become more prevalent, security and risk professionals should re-examine existing security functions from a new angle and consider the human factor while addressing security threats. The second prediction concerns governments' security capabilities: the research firm gives a bleak assessment of the security capabilities of the US government, which is short-staffed, under-budgeted and lacking internal discipline. The third prediction is an expected increase in security and risk spending of 5 to 10% in 2016. Fourth comes defense contractors' prospective entry into private industry with claims of 'military grade' security; Forrester, however, warns private players to thoroughly examine their commercial experience and commitment before acquiring them. The fifth prediction covers HR departments, which will bring in identity and credit protection and resolution services as an employee benefit in this era of increasing fraud, identity theft, medical identity theft and damage to personal online reputations. Read More:

With the holiday season coming up, cyber security researchers in the US warn about ModPOS, a malware that is largely undetectable by current antivirus scans. The researchers also point out that the malware has already infected some national retailers. According to them, it is one of the most sophisticated point-of-sale malware frameworks, capable of collecting detailed information about a company, including payment information and the personal log-in credentials of executives. To address the threat, companies need to use more advanced forms of encryption to protect consumer data. Point-to-point encryption, where a consumer's payment card data is unlocked only after it reaches the payment processor, is one effective method to combat the threat. Security experts warn that without such protections, even new credit cards with the chip technology known as EMV could still be compromised by infected point-of-sale systems. Read More:

The information security landscape is continuously evolving, and the proliferation of disruptive technologies like mobile, social, cloud and big data is increasingly impacting protection strategies. In-depth strategies to monitor, analyse and report security incidents are paramount to delivering an effective enterprise security risk management profile. Happiest Minds, with deep expertise in the security arena and a large pool of experienced security professionals, brings in security solutions that address the key challenges enterprises face today. Our services aim to improve the agility, flexibility and cost effectiveness of next-generation information security and compliance programs.

How Do You Solve a Problem Like Cyber Security?

Happiest Minds UK discusses the new-age deception technologies UK businesses should adopt to bolster their cyber-security defences
The recent TalkTalk cyber-security breach has brought the issue of security firmly back into the public’s psyche and has put both government and organisations on high alert. It seems that regardless of your vertical market, be it finance, technology or banking, the threat of a cyber breach is pretty much imminent. Only today I read an article which outlined that Britain’s Trident nuclear weapons system may be vulnerable to cyber-attack by a hostile state, according to former defence secretary Des Browne.
So, despite the UK being one of the highest EU spenders on IT security, existing cyber security solutions are simply not good enough to stop malicious hackers and evolving threats. It’s little wonder why Chancellor George Osborne has pledged to spend an additional £1.9 billion on cyber security and has committed to the creation of a ‘National Cyber Centre’ to respond to major attacks on Britain.
So, how do you solve a problem like cyber security? Well, the answer could well be to implement emerging deception technologies such as next-generation honeypots and decoy systems which, according to a new Gartner report entitled ‘Emerging Technology Analysis: Deception Techniques and Technologies Create Security Technology Business Opportunities’, could have a game changing impact on enterprise security strategies.
Deception technologies are effectively tools which deceive attackers and enable the identification and capture of malware at the point of entry. They misdirect intruders and disrupt their activities at multiple points along the attack chain by luring them towards fake or non-existent data and away from the organisation’s critical data.
Let us look at a few of these technologies in greater detail:
Honeypots—or software emulations of an application or server—have been around for a few years now. A honeypot works by offering ‘honey’, something that appears attractive to an attacker, who will then expend his resources and time on gathering the honey. In the meanwhile, the honeypot does an admirable job of drawing his attention away from the actual data it seeks to protect.
Decoys are similar to honeypots and cause the attacker to pursue the wrong (fake) information. Many decoys act together to fill the attacker’s radar in a manner as to render it difficult for him to differentiate between real and fake targets.
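To make the idea concrete, here is a toy decoy listener in Python. This is a minimal sketch only, not a description of any particular deception product; the port and banner are illustrative:

```python
# Toy decoy listener: accept connections on an unused port, log every attempt
# and present a fake banner. Real deception platforms do far more; this only
# illustrates the "attract and observe" idea.
import socket
from datetime import datetime

HOST, PORT = "0.0.0.0", 2222  # looks like an SSH-style service; nothing real behind it

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    print(f"Decoy listening on port {PORT}")
    while True:
        conn, addr = srv.accept()
        with conn:
            print(f"{datetime.now().isoformat()} connection from {addr[0]}:{addr[1]}")
            conn.sendall(b"SSH-2.0-OpenSSH_6.6\r\n")  # bait banner to keep the intruder engaged
```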
However, organisations are now looking for more active defence strategies that not only lure in attackers, but also trap them, confound them and track their activity. One such deception technology offers an emulation engine masquerading as a run-of-the-mill operating system. The ‘operating system’ contains ‘sensitive’ data that could be attractive to attackers, for example data labelled ‘credit card info’. The platform will lure the attacker in by allowing him to ‘hack’ this fake data and in turn start gathering information about his movements and the codes that he seeks to modify. This intelligence can then be shared with other security tools, such as intrusion prevention systems, to defend against the attack.
A number of start-ups are designing various kinds of intrusion deception software that insert fake server files and URLs into applications. These traps are visible only to hackers and not normal users. An example of such a snare could be trapping hackers probing for random files by granting them access to bogus files that are a dead-end and merely keep leading them in circles towards more fake data. Or protecting the system against brute-force authentication by scrambling the attacker’s input so he can never get the password right, even if he does happen to type out the right code.
Other technologies set up fake IP addresses on webservers that, on multiple attempts to hack them, will always present a deception to that user. Other companies set up virtual systems or computers that actually have no data on them, and are indistinguishable from other machines on the network. Repeated intrusion into and unwarranted activity on these systems make it easy to identify hackers. The hackers’ movements and methods can then be analysed, and the data fed back into other threat detection solutions and tools.
Deception technologies therefore create baits or decoys that attract and deceive attackers, making it quicker for an organisation to detect a security breach. They increase the attacker’s workload and exhaust his resources. Certain solutions go beyond merely setting up decoys to also conduct forensic analysis on these attacks so the organisation can effectively defend its network and speedily mitigate security breaches. It may not be a ‘one size fits all’ answer to the cyber security conundrum, but it is certainly one more weapon in the organisation’s armoury against hackers.

Wednesday 25 November 2015

Taming the Elephant....Building a Big Data & Analytics Practice - Part I

A couple of decades ago, the data and information management landscape was significantly different. Though the core concepts of Analytics have not, by and large, changed dramatically, adoption and the ease of analytical model development have undergone a paradigm shift in recent years. Traditional Analytics adoption has grown exponentially, and Big Data Analytics needs additional, newer skills.

For further elaboration, we need to go back in time and look at the journey of data. Before 1950, most data and information was stored in file-based systems (after the earlier discovery and use of punched cards). Around 1960, Database Management Systems (DBMS) became a reality with the introduction of hierarchical database systems like IBM Information Management System (IMS) and, thereafter, network database systems like the Raima Database Manager (RDM). Then came Dr. Codd's normal forms and the relational model. Small-scale relational databases (mostly single-user initially) like dBase, Access and FoxPro started gaining popularity.

With System R from IBM (which later became the widely used Structured Query Language database from which IBM DB2 was created) and the ACID-compliant (Atomicity, Consistency, Isolation, Durability) Ingres database being released, commercialization of multi-user RDBMS became a reality, with Oracle and Sybase (now acquired by SAP) databases coming into use in the following years. Microsoft had licensed Sybase on OS/2 as SQL Server and later split with Sybase to continue on the Windows platform. The open source movement continued with PostgreSQL (an object-relational DBMS) and MySQL (now acquired by Oracle) being released around the mid-1990s. For over two decades, RDBMS and SQL grew to become the standard for enterprises to store and manage their data.

From the 1980s, data warehousing systems started to evolve to store historical information and separate the overhead of reporting and MIS from OLTP systems. With Bill Inmon's CIF model and later Ralph Kimball's popular OLAP-supporting dimensional model (denormalized star and snowflake schemas) gaining popularity, metadata-driven ETL and Business Intelligence tools started gaining traction, while database product strategies promoted the then lesser-used ELT approach and other in-database capabilities, such as the in-database data mining released in Oracle 10g. For DWBI products and solutions, storing and managing metadata in the most efficient manner proved to be the differentiator. Data modeling tools started to gain importance beyond desktop and web application development, and Business Rules Management technologies like ILOG JRules, FICO Blaze Advisor and Pega started to integrate with DWBI applications.


Once data warehouses started maturing, the need for data quality initiatives started to rise. Most data warehousing development cycles used a subset of production data (at times obfuscated or masked) during development, so even when the implementation approach included data cleansing and standardization, core DQ issues would emerge after the production release, at times rendering the warehouse unusable until they were resolved.

Multi-domain Master Data Management (both operational and analytical) and Data Governance projects started to grow in demand once organizations began to view data as an enterprise asset, enabling a single version of the truth to increase business efficiency and supporting both internal and, at times, external data monetization. OLAP integrated with BI to provide ad-hoc reporting, besides being popular for what-if modeling and analysis in EPM/CPM implementations (Cognos TM1, Hyperion Essbase, etc.).

Analytics was primarily implemented by practitioners using SAS (1976) and SPSS (1968) for descriptive and predictive analytics in production environments, and ILOG CPLEX (1987) and ARENA (2000) for prescriptive modeling, including optimization and simulation. While SAS had programming components in Base SAS, SAS STAT and SAS Graph, the strategy evolved to move SAS towards a UI-based modeling platform with the launch of Enterprise Miner and Enterprise Guide, products similar to SPSS Statistics and Clementine (later the IBM PASW modeler), which were essentially UI-based drag-drop-configure analytics model development tools for practitioners usually having a background in mathematics, statistics, economics, operations research, marketing research or business management. Models used sample representative data and a reduced set of factors and attributes, and hence performance was not an issue at that point.

Around the middle of the last decade, anyone with knowledge and experience of Oracle, ERwin, Informatica and MicroStrategy, or competing technologies, could play the role of a DWBI technology lead, or even an information architect with additional exposure to and experience in designing non-functional DW requirements, including scalability, best practices, security, etc.

Soon the enterprise data warehouses, now needing to store years of data, often without an archival strategy, started to grow exponentially in size. Even with optimized databases and queries, there was a drop in performance. Then came appliances, or balanced/optimized data warehouses: optimized database software often coupled with the operating system and custom hardware. Most appliances, however, supported only vertical scaling. The benefits that appliances brought were rapid accessibility, rapid deployment, high availability, fault tolerance and security.
Appliances thus became the next big thing, with agile data warehouse migration projects being undertaken to move from RDBMS such as Oracle, DB2 and SQL Server to query-optimized DW appliances such as Teradata, Netezza and Greenplum, incorporating capabilities like data compression and massively parallel processing (shared-nothing architecture), among other features. HP Vertica, which took the appliance route initially, later reverted to being a software-only solution.

Initially, parallel processing had three basic architectures: MPP, SMP and NUMA. MPP stands for Massively Parallel Processing and is the most commonly implemented architecture for query-intensive systems. SMP stands for Symmetric Multiprocessing and has a shared-everything (including shared-disk) architecture, while NUMA stands for Non-Uniform Memory Architecture, which is essentially a combination of SMP and MPP. Over time, these architectural definitions became more amorphous as products kept improving their offerings.

While industry and cross-industry packaged DWBI and Analytics solutions increasingly became a product and SI/solution partner strategy, the end of the last decade saw growing adoption of open source ETL, BI and Analytics technologies such as Talend, Pentaho and the R libraries within industries (with the exceptions of the Pharma & Life Sciences and BFSI industry groups), and in organizations where essential features and functionality were sufficient to justify the ROI on DWBI initiatives, which were usually undertaken for strategic requirements rather than for day-to-day operational intelligence or insight-driven new revenue generation.

Cloud-based platforms and solutions, and even DWBI and Analytics application development on private or public cloud platforms like Amazon and Azure (IBM has now come up with BlueMix and dashDB as an alternative), also started to grow, either as part of a start-up strategy or as a cost optimization initiative of small and medium businesses, and even in some large enterprises as an exploratory initiative, given confidence in data security.

Visualization software also started to emerge and carve out a niche, growing in relevance mostly as a complementary solution to IT-dependent enterprise reporting platforms. Visualization products were business driven, unlike technology-forward enterprise BI platforms that could also provide self-service, mobile dashboards, write-back, collaboration, etc., but had multiple components with complex integration and, at times, complex pricing.

Hence, while traditional enterprise BI platforms had a data-driven "bottom up" product strategy, with dependence and control resting with the IT team, visualization software took a business-driven "top down" product strategy, empowering business users to analyze data on their own and create their own dashboards with minimal or no support from the IT department.

With capabilities like geospatial visualization, in-memory analytics and data blending, visualization software like Tableau is increasingly gaining acceptance. Others have blended visualization with out-of-the-box analytics, such as TIBCO Spotfire and, in recent years, SAS Visual Analytics, a capability that visualization tools otherwise achieve mostly by integrating with R.

All of the above was manageable with reasonable flexibility and continuity as long as data was more or less structured, ECM tools took care of documents, and EAI technologies were used mostly for real-time integration and complex event processing between applications and transactional systems.

But a few years ago, digital platforms, including social, mobile and others like IoT/M2M, started to grow in relevance, and Big Data Analytics grew beyond POCs undertaken as experiments to complement an enterprise data warehouse (along with enterprise search capabilities), and at times even to replace it. The data explosion gave rise to the 3V dilemma of velocity, volume and variety, and data was now available in all possible forms and in newer formats like JSON and BSON, which had to be stored and transformed in real time.

Analytics now had to be done over millions of records in motion, unlike the traditional end-of-day analytics over data at rest. Business Intelligence, including reporting, monitoring, alerts and even visualization, had to become real time. Even the consumption of analytics models now needed to be real time, as in the case of customer recommendations and personalization, to leverage the smallest windows of opportunity to up-sell and cross-sell to customers.
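As a small illustration of analytics over data in motion, the sketch below maintains a rolling count of events over the last 60 seconds rather than waiting for an end-of-day batch. The event source and field names are illustrative only:

```python
# Sketch: analytics over data in motion -- keep a rolling count of events
# per product over the last 60 seconds instead of an end-of-day batch job.
import time
from collections import deque, Counter

WINDOW_SECONDS = 60
events = deque()  # holds (timestamp, product_id) pairs inside the window

def ingest(product_id, now=None):
    """Add one event and return the per-product counts for the current window."""
    now = time.time() if now is None else now
    events.append((now, product_id))
    while events and now - events[0][0] > WINDOW_SECONDS:  # drop expired events
        events.popleft()
    return Counter(pid for _, pid in events)

ingest("sku-42")
ingest("sku-42")
counts = ingest("sku-7")
print(counts.most_common(1))  # [('sku-42', 2)]
```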

Artificial Intelligence systems powered by Big Data are becoming the game changer of the near future, and it is Google, IBM and others like Honda who are leading the way in this direction.
To be continued.........

5 Ways to Secure the Public Cloud

As cloud computing becomes more sophisticated and mainstream, the shift to the public cloud is gaining tremendous traction. With big-brand clouds (Amazon Web Services, Google Cloud Platform and Microsoft Azure) fast evolving, more and more enterprises are moving away from private clouds. However, security is justifiably a top concern when moving applications and data into the public cloud. Some of the questions foremost on everyone’s mind are: How secure is my data? What will happen if there is a breach at the public cloud vendor? How do I ensure that my data is properly protected in this case?

Security is ultimately a shared responsibility between the company and the public cloud vendor. According to Forrester, cloud success comes from mastering the “uneven handshake”. While cloud vendors are typically responsible for securing the data center, infrastructure and hypervisor, the onus is on you, as a consumer, to close the gap by securing the OS, users, applications and data – in tandem with the vendor.

Journeying to the Public Cloud

The key is to find the cloud provider that best fits your business. This means you need to thoroughly vet potential vendors and conduct a full risk assessment prior to signing any contract. Considering that different cloud service providers offer varying levels of security, it is best to look at their security and compliance activities and choose one with transparent processes. Once this decision has been made, the next step is to take into account the various security risks and chart possible solutions to create a secure cloud environment.

Here are 5 steps to best protect data in the public cloud:

Intelligent Encryption

Encryption is a vital security component for any organization, and it is all the more important when transferring and storing sensitive data in the cloud. It ensures data confidentiality, thus mitigating the risk of data loss or theft in case of a breach in the cloud. This focus on the data itself, rather than placing full emphasis on the infrastructure for protection, goes a long way in ensuring that data stays safe even if the network or perimeter security is compromised.
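As a simple illustration of encrypting data before it ever reaches the cloud, here is a minimal Python sketch using the third-party cryptography package. It is an assumption for illustration only; in practice any vetted encryption library would do, and keys belong in a key management service, not in code:

```python
# Sketch: encrypt a record client-side before it is uploaded to any public
# cloud store, so a breach of the storage layer alone never exposes plaintext.
# Assumes the third-party "cryptography" package (pip install cryptography);
# the key would normally live in a KMS/HSM, never in code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"card": "4111 1111 1111 1111", "owner": "J. Doe"}'
token = cipher.encrypt(record)        # this ciphertext is what gets uploaded

# ... later, after the object is downloaded back from the cloud ...
assert cipher.decrypt(token) == record
```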
Strict Identity Management and Access Control

An effective identity management strategy for the cloud can be summed up under the three ‘A’s – access, authentication and authorization. Consumers must ensure that only trusted and authorized users can access public cloud data, through a strong identity management system. Additional layers of authentication further help in ensuring a controlled cloud environment. An important note here is to find a good balance between security and developer productivity.
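A minimal sketch of how the three ‘A’s separate into distinct checks might look like the following; users, roles and resources are purely illustrative:

```python
# Sketch: the three 'A's as separate checks -- access is granted only when
# authentication and authorization both pass.
USERS = {
    "alice": {"password_ok": True, "roles": {"finance-read"}},
    "bob":   {"password_ok": True, "roles": set()},
}
PERMISSIONS = {"quarterly-report": {"finance-read", "finance-admin"}}

def authenticate(user):
    """Who are you? (stands in for passwords, MFA, federation, ...)"""
    return USERS.get(user, {}).get("password_ok", False)

def authorize(user, resource):
    """What may you touch? True if the user holds any role allowed on the resource."""
    allowed = PERMISSIONS.get(resource, set())
    return bool(USERS.get(user, {}).get("roles", set()) & allowed)

for u in ("alice", "bob"):
    granted = authenticate(u) and authorize(u, "quarterly-report")
    print(u, "access granted" if granted else "access denied")
```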

Smart Security at All End-points

In most cases, physical security is covered by the cloud provider through regular audits and certifications from accreditation bodies. In certain industries like healthcare, finance and defense, it is a regulatory mandate that there be security at all points along the data path – entering or exiting the corporate network, moving to the cloud, and within the cloud itself. As a general trend in today's cloud and BYOD era, however, it is of utmost importance that the consumer ensures certain hardware necessities and best practices for end-point security in addition to the cloud security measures. Mobile devices in particular pose a unique challenge because, despite the best intentions, users generally do not prioritize securing them; unfortunately, this exposes potential access points to sensitive corporate data. Strong end-point measures should typically encompass mobile/on-device protection, next-generation firewalls, network intrusion systems, VPNs and up-to-date security architectures.

Real-time Monitoring & Incident Response

As part of the shift to a “prevent and control attack” mindset, real-time monitoring through analytics and forensics enables consumers to identify attacks early in the breach lifecycle. Instant alerts and automatic data collection enable rapid forensics and insights into behavior from the endpoint to the cloud. Armed with these insights, security teams can identify potential risks and patterns in real time, while also determining the path for immediate remediation. Organizations should also focus on enterprise-level visibility for applications hosted in the cloud, in conjunction with the cloud provider, thus providing a multi-pronged approach for quick detection of and incident response to security issues.

Strong Governance Framework

A governance framework is an essential tool that enables your IT security team to assess and manage all risks, security and compliance related to the organization’s cloud environment. The crux of this framework is that it needs synergy between security, IT, the business and the organization itself for a secure cloud. A strong framework typically encompasses stringent security policies, audit compliance, identity management, security control tools, a BYOD policy and a contingency plan. To ensure true compliance with cloud policies, organizations have to work closely with IT security teams to understand the unique challenges of cloud security and ways to protect sensitive data workloads. Additionally, educating and training users to comply with the organization’s cloud policies can go a long way in achieving compliance.

Cloud computing is revolutionizing the way enterprises operate in today’s world with a slew of cost benefits and tremendous economies of scale. As with any other investment, it is your responsibility to ensure that cloud is protected as much as possible. With a robust set of security processes, tools, a clear BYOD-compatible cloud computing strategy and a strong governance framework in place, there is a significant reduction in risk as you embark into the cloud. And the future is yours as long as your organization continuously adapts to stay agile and competitive in a fast evolving cloud technology landscape.

Cyber Threat Intelligence – What is needed?

Cyber Threat Intelligence (CTI) is a term used for any kind of information that protects your organization’s IT assets from a potential security breach. CTI can take many forms: it could be internet-based IP addresses, geolocations or TTPs (tools, tactics and practices). These work as indicators or early warnings of attacks that can take a toll on an enterprise’s IT infrastructure. There are numerous vendors across the globe whose CTI can be seamlessly made part of security interfaces such as GRC tools, SIEM and other correlation engines. That being said, what information can be employed to generate actionable CTI to defend your enterprise? Let’s look at this in detail:
Drivers:
Drivers can be anything from attacks like a zero-day, to business-related breaking news, to announcements that create vulnerabilities in the enterprise’s activities. Understanding the nature of these drivers helps increase security vigilance.

Prerequisites:
This accounts for everything an attacker would need to trigger an attack on your IT infrastructure through the intranet perimeter, network, endpoints and anything else exposed to the internet.
Capabilities:
A script kiddie could launch an attack but may not have the capability for post-attack activities. Conversely, a professional attacker may have the capability to penetrate, while your defense mechanisms may not be able to stop the attacker from achieving the intended results. Understanding the capabilities of both the attacks and the attackers at length can help defend your security to a great extent.
Components:
Another element to be considered to better address security concerns is keeping an account of the tools, tactics and procedures that the attacker used in past attacks. This helps generate indicators to better prepare for forthcoming attacks.
Measurement:
Measurement is important to determine the impact of the attack, mostly in terms of the number and types of security events generated during the pre-attack phase. The more ways we can interpret the different natures and depths of these measurements, the more the security interface can work on counter-attack measures and recovery processes.
There are many security dimensions that, when considered carefully, can help avoid, tackle, monitor and recover from a security breach. While the ones above are a handful, the list can get a lot longer and include threat vectors, compromise parameters, defense mechanism techniques, business impact analytics, attack patterns from the past, zero-day detection, security control bypassing, post-compromise information, and so on. The more of these factors we include, the better IT security vigilance gets.
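As a simple illustration, threat intelligence ultimately boils down to indicators that a SIEM or correlation engine can match against logs. The sketch below uses purely illustrative indicator values (a documentation IP range), not real feed data:

```python
# Sketch: a tiny threat-intelligence feed as indicators that a SIEM or
# correlation engine could match against inbound log lines.
from dataclasses import dataclass

@dataclass(frozen=True)
class Indicator:
    kind: str      # "ip", "domain", "hash", ...
    value: str
    source: str    # which feed or vendor supplied it
    severity: str  # "low" / "medium" / "high"

FEED = [
    Indicator("ip", "203.0.113.50", "vendor-A", "high"),
    Indicator("domain", "login-update.example", "vendor-B", "medium"),
]

def match(log_line):
    """Return every indicator whose value appears in the log line."""
    return [i for i in FEED if i.value in log_line]

for hit in match("Blocked outbound connection to 203.0.113.50:443"):
    print(f"ALERT [{hit.severity}] {hit.kind}={hit.value} (feed: {hit.source})")
```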

Wednesday 18 November 2015

Store as Fulfillment Center: Omnichannel and the Future of Retail

Omnichannel has come of age for brick-and-mortar retailers.
Traditional retailers have been on a slow yet steady adoption of digital technologies over the last two decades. First arrived e-commerce, which retailers took on as another channel for customer acquisition and sales. Coupled with this emerged online-only players opening up new avenues of fulfillment. Then came smartphones, setting a new paradigm of customer experiences.
Today, with the faster evolution of technology and ever-increasing consumerization, there is a demand for ultimate flexibility and innovation. Customers expect to be recognized and pampered, and they switch loyalty for the smallest of added perceived value – be it monetary based, convenience based, or experience based.
Brick-and-mortar retailers with an established national and/or international store network are specifically suited to meet the customers of today where they are – online, on mobile, in a physical store, or even in a subway station.  These phy-digital retailers can and must strive for true omnichannel – seamless, connected, and personalized experiences irrespective of how and where their customers shop.
The Potential of a Store
Despite the increasing adoption of digital shopping, the fact remains that for bricks-and-clicks retailers over 90 percent of revenues come from physical stores, and the store therefore continues to be the nerve center of operations. It is important to realize the true potential of such retailers' huge store networks.
Stores can transform to be experience centers for omnichannel customers. Here are a few solutions that can bring transformational experiences in-store:
  • Experiential kiosks and digital displays
  •  Digital signage
  •  In-store IoT/ beacon-based personalized experiences
  •  Customer engagement driven by data insights
Stores can be mini-fulfilment hubs, offering ultimate flexibility when it comes to delivery choices and saving a potentially lost sale. Examples of such initiatives include the following:
  • Order online to pick up in store or at curb side, fulfilled from store or warehouse
  •  Order in-store for home delivery, from a warehouse, same store, or another store
  •  Order in-store for pick up from store, from same store or another store
When armed with the right tools and technologies, store associates can be brand ambassadors, driving customer loyalty and improving customer retention. For example, when a store associate is asked about a salmon pink shirt that was found online but is not in stock in store, the associate should be incentivized, and have the tools, to check the inventories of nearby stores or the distribution center. Further, the associate should be empowered to take the order and have the product shipped to the customer's home at no extra charge the next day.
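A minimal sketch of that “save the sale” flow might look like the following; the inventory data and function names are hypothetical, not a reference to any specific order management system:

```python
# Sketch of the "save the sale" flow: check nearby inventory and, if stock is
# found, create a ship-to-home order at no extra charge.
INVENTORY = {  # location -> {sku: units on hand}
    "store-012": {"shirt-salmon-M": 0},
    "store-044": {"shirt-salmon-M": 3},
    "dc-east":   {"shirt-salmon-M": 120},
}

def find_stock(sku, exclude):
    """Locations (other than the current store) holding the SKU."""
    return [loc for loc, stock in INVENTORY.items()
            if loc != exclude and stock.get(sku, 0) > 0]

def create_ship_to_home_order(sku, source, customer_address):
    return {"sku": sku, "fulfil_from": source,
            "ship_to": customer_address, "delivery_charge": 0.0}

locations = find_stock("shirt-salmon-M", exclude="store-012")
if locations:
    print(create_ship_to_home_order("shirt-salmon-M", locations[0], "221B Baker St"))
```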
It’s a no-brainer that omnichannel retailers must invest in technologies that deliver the data to drive store-transformation initiatives.
Implications for Brick-and-Mortar Retailers
For a complete omnichannel transformation to be successful over next two to three years, the foundation has to be strong. It starts with a data-driven, single view of the customer, orders, inventory, products, etc. and a scalable architecture to support dynamic changes in business.
  • To enable an endless aisle of products not limited to a store’s physical space, a global product catalog should be available across channels, including your extended supply network and drop-ship vendors.
  •  To enable stores to be fulfilment hubs, a real-time and reliable view of inventory data should be available across the entire supply network.
  •  And for personalization to click, a 360-degree view of customers’ online orders, store transactions, social engagement, lifetime value, loyalty history including open orders, queries, and complaints is a must.
Orchestrate transformational customer journeys. Decoding retail customer journeys is the starting point to digital transformation. In the era of design thinking and customer experience, a new paradigm of solution design is evolving. Yes, there are beacons, there is big data, there is fast data, there are mobile technologies and cloud applications that promise Nirvana. However, to get transformational business outcomes, there is a need for careful curation of experiences.
Bricks-and-clicks retailers must orchestrate an end-to-end experience that is beyond a pointed technology solution to solve a particular problem like knowing what the customer did on the website or what she purchased in a store. It is about bringing all the insights and business states about products, customers, and even assets like dressing rooms to curate a new digital journey for the customer in-store.
Empower store associates. Retailers must realize the importance of their associates as omnichannel evangelists who can make or break seamless experiences for the customer. Incentivizing cross-channel “save the sale” behavior is one key paradigm shift that retailers must consciously undergo.
The store associate must be equipped with data on products available across different distribution channels and, to be credible brand advocates, also must be as knowledgeable as her customer. She needs the right technology to have access to meaningful insights on her customer in order to offer a personalized experience. Tools and technologies that can provide data that deliver in-the-moment, 360-degree views on customers, enterprise-level inventory data, mobile point of sale, and in-built intelligence to provide the right recommendations (product recommendations, substitutes, alternate fulfillment options, dynamic offers) are critical for associate empowerment.
The benefits of executing well on all the above initiatives are increased footfalls, increased conversions with a multiplier effect across channels and, most importantly, increased customer loyalty and retention.