Welcome to My Blog




Wednesday, June 6, 2018

Seven Ways to Prepare Your Data Center for a Natural Disaster

No one wants to think about a disaster crippling or even destroying their data center. But even as hurricane season has ended for Atlantic and Gulf Coast states, wildfires are raging in Southern California, and earthquakes are an ever-present danger. Disaster planning is moving higher up the priority list for many data center managers.

Disaster recovery (DR) planning typically focuses on data protection and application availability. Most organizations consider the information maintained on servers and storage devices to be infinitely more valuable than the technology itself. However, DR plans should also include provisions for protecting equipment from physical damage.

Location: Ideally, a data center would be located in a geographic area that's not prone to natural disasters. That's seldom possible, so organizations must do the best they can to isolate the data center from any disaster that does occur. That means locating it in an interior room, or at least as far away from windows as possible. In areas where hurricanes and tornadoes are the greatest threat, an underground location may be the best option (unless flooding is a problem). In earthquake zones, it's critical to select a well-constructed building that's compliant with the latest codes.

Backup Power: Power outages are a leading cause of equipment downtime, and UPS failure is the No. 1 cause of unplanned equipment outages. Uninterruptible power supplies should be carefully selected, implemented and maintained to ensure a steady supply of conditioned power with a regulated voltage level.

Fire Suppression: Many data centers rely on conventional sprinkler systems, but water can destroy equipment and cause other problems as well. A better approach is to employ a dry "pre-action" system that will extinguish most fires before the sprinkler system is activated. Modern fire-suppression systems use halocarbons, which remove heat from fires, or inert gases, which deprive them of oxygen. Both can provide excellent fire suppression if the system is properly designed, installed and tested. The fire alarm should also be tested - if it is faulty, the fire-suppression system might not be activated.

Flood Control: If the data room is located in a flood-prone area, a pumping system should be installed. The system should activate automatically and be connected to generator power so that it continues operation if the electric grid goes down.

Earthquake Protection: In earthquake-prone areas, it's important to select racks and cabinets that are rated to withstand seismic activity. These units typically have special mounting brackets to attach them securely to the floor.

Flexible Processes: Data center personnel should understand their responsibilities and be thoroughly trained in DR procedures. Equipment should be monitored by at least one person at all times. Run-books should be kept up to date so that equipment can be recovered or reconfigured quickly in an emergency. DR processes should also be well-documented, but flexibility is important. Personnel should feel empowered to make decisions and improvise based upon the situation at hand.

Test: In most organizations, the DR plan is seldom, if ever, tested. The plan should be tested at least twice a year and updated as the environment and business priorities change.

These seven steps can help you design a flexible and resilient data center infrastructure and select systems that will protect your valuable equipment and keep it running efficiently.

Saturday, September 7, 2013

Designing for Data Quality

A few simple design choices can dramatically improve the quality of data across your enterprise systems.

Most, if not all, data quality problems are caused by human error.

Approximately 80% of errors are simple data capture errors - users entering the wrong information - with the balance largely arising through poor data integration.

Over the last fifteen years I have delivered multiple data quality audits and assessments in different environments, and based on my experience I suggest that a few simple design choices can have a dramatic impact on your ability to manage information quality at a holistic level.

1. Plan to capture the User and Date that information was captured, or modified.

Data profiling and discovery tools uncover interesting patterns of behavior in your systems. If this behavior can be linked to specific users, groups, or time periods then it can be managed.

For example, we may identify that x% of our information has an incorrect link between supplier and product code. We can now go ahead and fix the problem but we have no real insight as to when, or why, it occurred. Data governance, and root cause analysis, require context for our information.

Date of Capture information gives you important context.

Is this an old problem that has subsequently been resolved?

System validation may have improved, but we have been left with a legacy of erroneous, poor-quality records.

Or maybe the errors can be tied back to a historical event. Do these records link back to the migration of information from the previous ERP platform into the current one?

Maybe the errors have started recently - have there been any recent system changes that may have allowed users to capture faulty records?

Similarly, User information gives you context.

Can you track patterns of behavior to specific users or teams?

Users will develop certain patterns of behavior, or workarounds, in order to bypass system restrictions where these are considered to be onerous, or where they do not allow the task to be performed.

For example, a system may require a Client Account ID to be captured before allowing a call to be completed. If the client does not know, or will not share, this information, the call center agent, under pressure to complete the call quickly, may capture another client's ID instead.

Patterns in behavior by specific users, or groups of users, are a key indicator of a broken business process.

Further investigation will need to be done by the data stewards.

Maybe the problem can be tied back to overly ambitious system validations?

Do the users need training or additional support? In many cases, these errors can be solved by education.

Do your user's KPIs need adjustment? Many data quality errors are caused because users are measured on volume of data captured rather than on quality of data captured.

Quite possibly there will be a combination of some or all of these factors.

Designing with data quality in mind means giving context to errors! You may want to add additional information to your systems.
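As a rough sketch of what this can look like in practice, here is a minimal example using Python's built-in sqlite3 module. The table, column and function names (customer_order, created_by, created_at, modified_by, modified_at and so on) are purely illustrative assumptions, not a prescribed schema.

    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")

    # Every row records who captured it and when, and who last modified it and when.
    conn.execute("""
        CREATE TABLE customer_order (
            order_id     INTEGER PRIMARY KEY,
            customer_id  INTEGER NOT NULL,
            created_by   TEXT    NOT NULL,
            created_at   TEXT    NOT NULL,
            modified_by  TEXT,
            modified_at  TEXT
        )
    """)

    def capture_order(order_id, customer_id, user):
        """Insert a new order, stamping the capturing user and timestamp."""
        conn.execute(
            "INSERT INTO customer_order (order_id, customer_id, created_by, created_at)"
            " VALUES (?, ?, ?, ?)",
            (order_id, customer_id, user, datetime.now(timezone.utc).isoformat()),
        )

    def modify_order(order_id, new_customer_id, user):
        """Correct an order, stamping the modifying user and timestamp."""
        conn.execute(
            "UPDATE customer_order SET customer_id = ?, modified_by = ?, modified_at = ?"
            " WHERE order_id = ?",
            (new_customer_id, user, datetime.now(timezone.utc).isoformat(), order_id),
        )

    capture_order(1001, 42, "j.smith")
    modify_order(1001, 57, "data.steward")

    # With these columns in place, profiling can group suspect records
    # by user and by month to show who captured them, and when.
    for row in conn.execute(
        "SELECT created_by, substr(created_at, 1, 7) AS month, COUNT(*) "
        "FROM customer_order GROUP BY created_by, month"
    ):
        print(row)

Because every row now carries who captured or changed it, and when, profiling results can be tied back to specific users and time periods - exactly the context that root cause analysis needs.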

2. Use a "soft" delete / merge

Another issue we may uncover in your information is that of so-called "orphan records" - records that have lost their partner.

Two simple examples - a delivery note that does not have a delivery address, or an order that does not have a customer.

In some cases, these records are simply captured incorrectly - the user accidentally types in a non-existent customer number.

In this case, you can do root cause analysis as per point 1.

However, in many cases this issue is caused by one of the records being deleted after the event. Your user linked the order to an existing customer and, later, another user deleted the customer record.

Deletion and merging are important tools for managing data integrity. If you want to reduce faulty or duplicate records you must give users the tools to sort out these issues.

A deletion is used when a record is no longer relevant. There can be a number of good business reasons to delete a record - for example, a legal requirement to cease doing business with a particular client. A so-called soft delete provides you with a means to treat the record as deleted, without losing any information.

A soft delete means that, instead of physically removing the record from the underlying database, the record is marked as deleted. This means that users will not be able to access or use that record, but that it will still be available for audit purposes.
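As a rough illustration of the idea, again using Python's built-in sqlite3 module; the deleted_at and deleted_by columns and the active_customer view are illustrative choices, not a standard.

    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE customer (
            customer_id INTEGER PRIMARY KEY,
            name        TEXT NOT NULL,
            deleted_at  TEXT,   -- NULL means the record is live
            deleted_by  TEXT
        )
    """)
    conn.execute("INSERT INTO customer (customer_id, name) VALUES (1, 'Widgets Company Inc.')")

    def soft_delete_customer(customer_id, user):
        """Mark a customer as deleted instead of physically removing the row."""
        conn.execute(
            "UPDATE customer SET deleted_at = ?, deleted_by = ? WHERE customer_id = ?",
            (datetime.now(timezone.utc).isoformat(), user, customer_id),
        )

    # Day-to-day queries read through a view that hides deleted rows,
    # while auditors can still query the base table directly.
    conn.execute(
        "CREATE VIEW active_customer AS SELECT * FROM customer WHERE deleted_at IS NULL"
    )

    soft_delete_customer(1, "data.steward")
    print(conn.execute("SELECT COUNT(*) FROM active_customer").fetchone())  # (0,)
    print(conn.execute("SELECT COUNT(*) FROM customer").fetchone())         # (1,)

The view keeps deleted records out of everyday use, while the base table retains them for audit.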

A merge is used when you identify that two or more records exist for the same entity. This is an extremely common problem, most efficiently picked up through the use of automated data cleansing and matching tools.

For example, the supplier records for "Mr J Bloggs, CDO at Widgets Co" and "Joseph P. Bloggs, Chief Data Officer, Widgets Company Inc." represent the same supplier.

In order to clean up our system we need to merge these records to create a single, unified supplier record.

A soft merge would link both records via a common key, allowing us to maintain the integrity of all linking transactions, before soft deleting all but one of the set.

Your system should be designed to facilitate soft deletes and soft merges.

Plan to allow the addition of linking keys to group similar or related records, as well as for the use of a soft delete.
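Continuing the same illustrative sqlite3 sketch, one possible shape for a soft merge uses a master_id column as the common linking key; again, the table, column and function names are assumptions for the example rather than a prescribed design.

    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE supplier (
            supplier_id INTEGER PRIMARY KEY,
            name        TEXT NOT NULL,
            master_id   INTEGER,  -- common key linking duplicates to the surviving record
            deleted_at  TEXT,     -- soft-delete marker; NULL means live
            deleted_by  TEXT
        )
    """)
    conn.executemany(
        "INSERT INTO supplier (supplier_id, name) VALUES (?, ?)",
        [(1, "Mr J Bloggs, CDO at Widgets Co"),
         (2, "Joseph P. Bloggs, Chief Data Officer, Widgets Company Inc.")],
    )

    def soft_merge_suppliers(survivor_id, duplicate_ids, user):
        """Link duplicate records to the survivor via master_id, then soft delete them."""
        now = datetime.now(timezone.utc).isoformat()
        for dup_id in duplicate_ids:
            conn.execute(
                "UPDATE supplier SET master_id = ?, deleted_at = ?, deleted_by = ?"
                " WHERE supplier_id = ?",
                (survivor_id, now, user, dup_id),
            )

    # Record 1 survives; record 2 is linked to it and hidden from normal use,
    # but stays in the table so existing transactions keep their references.
    soft_merge_suppliers(survivor_id=1, duplicate_ids=[2], user="data.steward")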

When used in combination with a data quality metrics program, these simple tips provide a solid foundation for solving most data quality issues.

Sunday, July 14, 2013

Data Capture Accuracy Used Effectively

A brief introduction for those who don't know what it is: data capture is a method of extracting information from forms and surveys that have been filled out by people, either by hand or digitally. Once someone has filled in a survey or form, it can be scanned and captured, and the extracted information used for its original purpose or for research; using a dedicated data capture service can save a considerable amount of time compared with doing the work by hand.

That's the basic gist of what data capture is, whether for forms, surveys or other documents that need their data extracted. And although data capture has had a standard use ever since it became a popular service, it is also being used for many other purposes in 2013.

Along with general marketing and research techniques, the data capture of forms has had to change and adapt to changing attitudes throughout the world. In 2012 and into 2013, more and more business cards are being captured, and their details - including phone numbers, email addresses and other important information - are put into databases so they can be used effectively in the future for a mixture of mailing lists and email marketing.

However, there's another side to business card capture: a lot of restaurants and entertainment establishments offer a raffle for those who leave their business cards behind in a jar, so that the eventual winner receives a prize and the establishment can use all the data from the cards for marketing purposes. You may think it's a difficult and expensive process to capture business card data because of their different layouts and ordering, but it's a lot simpler than it seems, especially using automated data capture.

Form data capture has also now been picked up by health organizations such as the NHS in order to survey patients and staff on their opinions of the service. As one of the most scrutinized and monitored service providers in the UK, the NHS needs to stay on top of its game at all times, and this is one way of checking up on the satisfaction of its users. However, it receives thousands of these feedback forms at any given moment, so they can't all be input manually; the work is outsourced so that the information can be gathered into easily manageable data and presentations, letting staff browse through the data with ease and get to the bottom of any problems, or see where they are exceeding expectations.

As such, data capture has become a service that extends beyond market research companies and marketing firms; it is now an essential tool for gaining insight into issues as well as for gathering general feedback. As businesses across the world are subject to more scrutiny and higher expectations, it has almost become a requirement to be the best you possibly can be, and more besides - to go the extra mile.

People as a whole are still more likely to fill out a paper-based form than one sent to them online or via a link. So whilst the uses of this particular service change over time, the need for it remains.