Rules to Better SQL Databases - Admin
Here are some of the typical things that all SQL Server DBAs and Database developers should know. These rules are above and beyond the most basic textbook recommendations of:
- Ensuring your databases are Normalized and in 3rd Normal Form
- Making sure you have primary keys, foreign keys and simple indexes to improve performance
- Making sure you Back up regularly
- Basic Naming conventions (see some of our object naming conventions)
- Minimizing resultset sizes and data over the wire
If you measure uptime, you can proactively inform your manager how successful you have been as a DBA. You can do this in two ways:
Option 1: High Tech Solution - using System Center Operations Manager (SCOM)
SCOM allows you to monitor and generate reports on the total uptime of your SQL Server and other service level exceptions. You need the following for these reports:
- System Center Operations Manager and SQL Server on the network when performing a network scan
- Microsoft System Center Management Pack for SQL Server
Option 2: Low Tech Solution - using a recurring select as a heartbeat
- Run a query as a "ping" once every 5 minutes - something that takes about 2 seconds, e.g. run SELECT * FROM Orders five times
- Log the result with the time
- Graph the results to see uptime
- Graph the results to see performance trends
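A minimal sketch of the low-tech heartbeat (the table name, column names and the Orders table are assumptions for illustration, not part of any standard):

```sql
-- Hypothetical heartbeat log table - create this once
CREATE TABLE dbo.zsHeartbeatLog (
    HeartbeatID INT IDENTITY(1,1) PRIMARY KEY,
    PingTime    DATETIME NOT NULL DEFAULT GETDATE(),
    DurationMs  INT NOT NULL
);
GO

-- Schedule this batch every 5 minutes (e.g. as a SQL Server Agent job)
DECLARE @start DATETIME;
DECLARE @dummy INT;
SET @start = GETDATE();

-- The "ping" query - any cheap, representative query will do
SELECT @dummy = COUNT(*) FROM dbo.Orders;

-- Record how long the round trip took
INSERT INTO dbo.zsHeartbeatLog (DurationMs)
VALUES (DATEDIFF(ms, @start, GETDATE()));
```

Graphing PingTime gaps gives you uptime; graphing DurationMs gives you a performance trend.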
Most patches are for security. SQL Slammer showed that it's no good waiting a month before you decide to install a service pack. I would say wait one week and then install the service pack (maximum delay should be 2 weeks)
- Manually check your server using @@version
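For example, to check which version and service pack level a server is running:

```sql
-- Returns the full version string, including service pack level
SELECT @@VERSION;

-- Or query the individual properties
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ServicePackLevel,
       SERVERPROPERTY('Edition')        AS Edition;
```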
Many developers are frightened of making a change to the existing database because they just don't know what applications are using it. This is especially a problem when you are dealing with databases that you did not create. Here are some approaches to this issue:
- You could run around the office, find someone and hope they know (unbelievably, this seems to be the most common method!)
- Trawl through source control, all network locations and all the source code around to check what connection strings are being used
- You can have a zsApplication table and manually populate it with the applications that use the database (recommended). This can be populated from a run of SQL Profiler over a period of a week, so all usage is captured.
- Keep a constantly running login audit with a SQL Server Profiler trace that saves to a table, and make sure all applications have an application name in their connection string. This method is the most comprehensive option, but it is not recommended because you get a constant performance hit from SQL Profiler running.
Figure: SQL Profiler can help you design for change by auditing Login events, giving you a guide on what applications are connecting to your database
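A minimal sketch of the recommended zsApplication approach (the table layout and the application name are assumptions for illustration):

```sql
-- Hypothetical table recording which applications use this database
CREATE TABLE dbo.zsApplication (
    ApplicationID   INT IDENTITY(1,1) PRIMARY KEY,
    ApplicationName NVARCHAR(128) NOT NULL,
    Notes           NVARCHAR(400) NULL
);
GO

INSERT INTO dbo.zsApplication (ApplicationName, Notes)
VALUES (N'TimesheetEntry', N'Uses the Orders and Client tables');
```

For any tracing approach to be useful, each application should identify itself via `Application Name=...;` in its connection string; on SQL Server 2005 and later you can then see who is currently connected with `SELECT program_name FROM sys.dm_exec_sessions`.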
Every time a change is made to your product's SQL Server Database, script out the change. You can use Enterprise Manager, VS.NET or Query Analyzer but every time you make changes you must save the change as a .sql script file so any alterations are scripted. Everything at SSW is usually done three times, once on Development, once on Staging and once on Production. Change control is one of the most important processes in ensuring a stable database system.
Keep the scripts in a separate directory to any other scripts or files. This way you can always go back to them and find out what alterations you have made to the database in version xxx to find errors. If you have all the scripts you are able to rebuild the database from scratch. At SSW we name this folder SQLChangeScripts so as to not confuse it with other script folders.
The script file format should be: <version>_<description>.sql
The <version> should be a number padded with leading zeros (0) to form 3 or 4 digits (however many you need), e.g. 012_ClientInvoice_AddClientIDColumn.sql.
Depending on which components you decide to install on your SQL Server, you may need to configure the following services:
- SQL Server
- SQL Server Agent
- SQL Server Reporting Services
- SQL Server Integration Services
- SQL Server Fulltext search
- SQL Server Analysis Services
In the service properties window for these services, ensure that the Service Startup Account is run as "This Account" and not as "Built-in Account". Otherwise, you won't get all the functionality by default such as the ability to use Replication, Linked Servers or connect to other machines.
For security, you should not have this domain account in the Administrators group.
When you create the domain accounts under which SQL Server services run, you should name this domain account as "SQLServerMachineName".
If one of the SQL Server services updates a file on the network, then you can then determine which server wrote to the file.
You should always run all SQL Server services with the lowest possible privileges allowed in case the account is compromised. SQL Server setup makes the whole process of granting privileges a whole lot easier because it automatically creates groups with all the necessary permissions for you!
If you are running any SQL Server service under a user account that has administrator privileges, a user who compromises that account could do anything the administrator could do - including playing around with the registry via procedures like xp_regdeletevalue. So, if you use an administrator account, you're in effect giving away the keys to the house. Is this something you want to do?
SQL Server includes Database Mail (it was a new feature released back in 2005 as a replacement for SQL Mail). Database Mail is a great feature as it allows:
- HTML messages natively supported - so there's no need to use 3rd party dlls anymore
- Communication direct with SMTP server - There's no need for outlook or MAPI profiles on server
- Multiple profiles and accounts supported to specify multiple SMTP servers or different email infrastructure situations
- SQL Server queues messages even when the external mailing process fails
- High security - users and roles have to be granted permission to send mail
- Logging and auditing
- Attachment size regulations and file extension requirements
```sql
EXEC master.dbo.xp_smtp_sendmail
    @FROM      = N'firstname.lastname@example.org',
    @FROM_NAME = N'Sophie Belle',
    @TO        = N'email@example.com',
    @subject   = N'Vendor List',
    @message   = N'The list of vendors is attached.',
    @type      = N'text/html',
    @server    = N'mail.company.com.au'
```
Figure: Bad example - Avoid using SQL Mail - you need to have Outlook on the server and there is no built-in logging
```sql
USE msdb
GO
EXECUTE dbo.sp_send_dbmail
    @profile_name = 'UTS',
    @recipients   = 'firstname.lastname@example.org',
    @body         = 'The list of vendors is attached.',
    @query        = 'USE AdventureWorks; SELECT VendorID, Name FROM Purchasing.Vendor',
    @subject      = 'Vendor List',
    @attach_query_result_as_file = 1
```
Figure: Good example - Use database mail for scalability, built-in logging and HTML capability
SQL Alerts are valuable because they can alert administrators of imminent SQL Server failures. e.g. when the msdb log file is full. To enable, you should change the settings under SQL Server Agent.
SQL Server has no default alerts. You will have to create them, and I recommend that you add all the fatal-level exceptions as alerts.
In addition, don't forget to add your own alerts, such as an alert for the business exception "outstanding amount > $15,000". Many people are not aware of this functionality and instead write triggers or buy 3rd party products to do the same job.
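As a sketch, a severity-based alert can be created with msdb's system stored procedures (the alert and operator names below are placeholders):

```sql
-- Create an operator to notify (name and address are placeholders)
EXEC msdb.dbo.sp_add_operator
    @name = N'DBA Team',
    @email_address = N'dba@example.com';

-- Alert on any severity 17 error (insufficient resources, e.g. a full log file)
EXEC msdb.dbo.sp_add_alert
    @name = N'Severity 17 - Insufficient Resources',
    @severity = 17;

-- Email the operator when the alert fires
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Severity 17 - Insufficient Resources',
    @operator_name = N'DBA Team',
    @notification_method = 1;  -- 1 = email
```

Repeat the sp_add_alert/sp_add_notification pair for each fatal severity level you want covered.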
It is surprising how many IT staff create backup plans and then leave it at that. To have a complete maintenance plan, you should also consider the following:
- Checking database integrity
- Shrinking Databases
- Reorganizing Indexes
- Rebuilding Indexes
- Updating Statistics
- Cleaning up old maintenance histories
- Performing automatic backups
- Backing up System databases
- Last but not least - you should regularly check that the maintenance plans have been running successfully. Otherwise, all your backup and maintenance efforts are pointless
This can be found under Management | Maintenance Plans within the database tree in SQL Server Management Studio.
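One way to check that your maintenance jobs have actually been succeeding is to query the Agent job history in msdb (a sketch; filter further as you need):

```sql
-- Outcome of each SQL Server Agent job run
SELECT j.name AS JobName,
       h.run_date,
       h.run_time,
       CASE h.run_status
            WHEN 0 THEN 'Failed'
            WHEN 1 THEN 'Succeeded'
            WHEN 2 THEN 'Retry'
            WHEN 3 THEN 'Canceled'
       END AS Outcome
FROM msdb.dbo.sysjobs j
INNER JOIN msdb.dbo.sysjobhistory h
    ON h.job_id = j.job_id
WHERE h.step_id = 0  -- step 0 is the overall job outcome row
ORDER BY j.name, h.run_date DESC, h.run_time DESC;
```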
Scripts are an important component in the operation of any database. This is why you should back up all your scripts and historical schema snapshots - so you can track the scripts that have been run and those that need to be deployed to test and production databases. We typically store these in source control such as VSS or Team Foundation Server as a Visual Studio Database project. You should regularly generate full scripts of all objects changed, keeping the following points in mind:
- Don't encrypt your database objects if you can avoid it - otherwise, they can't be scripted.
- Use either the Enterprise Manager Generate Scripts Wizard or the SQL-DMO object model to script out the objects
Restoration is vital, we all agree. But rapid restoration with minimal downtime is just as important. Run practice restorations on a regular basis, as you don't want to find out that your backup doesn't restore when a problem has already occurred. This restoration process is so important that you should have a termination clause in an employee's contract if the restore doesn't work when something goes wrong.
If the restoration process is not clear and too complicated, then it will cost you both time and money when you are already in a pressure situation. Make sure you have a step by step and comprehensive restore standard with screenshots for every step of the way so there are no nasty little surprises when your system goes down.
SQL Server stores vital error and performance information in several different logs. You should be aware of all of them:
SQL Server Error Logs
- Configure how many you want to keep
- You should Back up your SQL Server error logs with your other scripts
SQL Server Agent Error Log
- Recycles after every service restart
Job History Logs
- Agent properties, Job System tab
- Probably too low by default
- Database maintenance (DBMaint) history logs
- Windows Event Viewer - check it for SQL Server issues
- Disable Administrator and Rename it, then create a new "honeypot" Administrator account with no permissions.
- Disable Guest on the SQL
- Change Port 1433
- Delete the sample databases (AdventureWorks, Northwind and Pubs). These grant permissions to the Public role, which is a security risk, and allow massive SQL statements to be run against the server
Other security issues
- Use a service account with a strong password
- Do not run SQL Server service as an administrator
- Run in integrated security mode
- Run on NTFS file system - Encrypt the data files
Configure login security auditing:
- Not on by default
- Configure on the security tab of Server Properties in SQL Server Management Studio
- Enable for Failure
- View using the Windows Event Viewer
Note: You can also turn on a trace for SQL DDL statements.
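On SQL Server 2005 and later, one way to capture DDL operations is a database-level DDL trigger (a sketch; the audit table is an assumption for illustration):

```sql
-- Hypothetical audit table for DDL events
CREATE TABLE dbo.zsDdlAudit (
    AuditID   INT IDENTITY(1,1) PRIMARY KEY,
    EventTime DATETIME NOT NULL DEFAULT GETDATE(),
    LoginName NVARCHAR(128) NOT NULL,
    EventXml  XML NOT NULL
);
GO

-- Fires for every DDL statement run in this database
CREATE TRIGGER trDdlAudit ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    INSERT INTO dbo.zsDdlAudit (LoginName, EventXml)
    VALUES (ORIGINAL_LOGIN(), EVENTDATA());
END;
```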
Change the default log size from 20,480KB to 64,000KB and set it to "Overwrite events as needed". This will allow users to view security audits and errors much further into the past with a minimal increase in space - and it will never bloat your server.
What goes for permissions to Windows objects also goes for SQL Server objects. Remove all permissions but the bare minimum required to operate your application.
Performance alerts work well for problems that need to be discovered before they occur.
For example, one problem that you may encounter is database file growth. Since databases are set to grow to a certain limit, you need to configure an alert to let you know when your database draws close to that threshold. You can configure a performance alert that fires when the file reaches 80% of that threshold. Here is an example of what you can do:
To configure an alert to trigger an application, perform the following steps:
- Start the Microsoft Management Console (MMC) Performance snap-in (Start, Programs, Administrative Tools, Performance).
- Expand Performance Logs and Alerts, and select Alerts.
- Right-click in the right pane, and select New Alert Settings.
- Enter a name for the setting that reflects what the alert will monitor, and click OK.
- On the General tab, add the counter the alert will monitor and specify the values that will trigger the action.
- On the Action tab, select the Run this Program checkbox.
- Click the Browse button, and select the name of the application you want to run.
- Click OK.
You have just configured an application to run in response to an alert. Unfortunately, because the program doesn't interact with the desktop, it runs in the background, visible only in Task Manager. To enable the program to run interactively, perform the following steps:
- Start the MMC Services snap-in (Start, Programs, Administrative Tools, Services).
- Right-click Performance Logs and Alerts, and select Properties.
- On the Log On tab, specify the "Local System account" and select the "Allow service to interact with desktop" checkbox.
Collation is the combination of language and sort orders, and you typically don't notice it until you start running cross-database queries.
It would make development simpler if the whole world spoke one language but even if you are using English, you will still encounter collation issues if you are not careful. The most common issue is the dreaded 'Cannot resolve collation conflict for equal to operation' error when joining on columns that have different collation orders. Collation is a great feature for international companies, but if you are not consciously using it then you should have ALL the objects in ALL the databases on ALL the servers using a consistent collation.
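To check whether your servers are actually consistent, you can list each database's collation and then look for columns that differ from their database default (SQL Server 2005 and later):

```sql
-- Collation of every database on the server
SELECT name, collation_name
FROM sys.databases;

-- Columns in the current database whose collation differs from the database default
SELECT OBJECT_NAME(c.object_id) AS TableName,
       c.name AS ColumnName,
       c.collation_name
FROM sys.columns c
WHERE c.collation_name IS NOT NULL
  AND c.collation_name <> CONVERT(NVARCHAR(128),
        DATABASEPROPERTYEX(DB_NAME(), 'Collation'));
```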
Flexibility with collation orders has increased a lot since SQL 7.0:
- SQL 7: Back in SQL Server 7, you could only define the collation at the server level and, once it was set, you could not change it without rebuilding the master database.
- SQL 2000: This added the ability to have Column level collation which allows you to set it at the database or column level.
However, with this column-level flexibility come additional issues. It is ideal for those who only want the column name 'FirstName' to be represented in accent insensitive sort order. However, one of the side effects, if you are not taking notice of collation, is that you end up with many different collations on many different databases.
We feel that the only time you need inconsistent collations is when you have a rogue 3rd Party application like Microsoft Great Plains that enforces its own collation.
See these Knowledge Base articles for more information about the issues you will encounter when you have inconsistent collations:
- Q211874 - Why do I get the error 'Cannot resolve collation conflict for equal to operation'? The database collation differs from the SQL Server default collation because it was attached or created with a different collation order. This causes issues when you attempt to join tables in databases that have different collation orders. For example, if your tempdb database and Northwind each have a different collation you will get the following error 'Cannot resolve collation conflict for equal to operation' when you attempt to do a join between tables from these databases
- Q711843 - How do I change the collation order in my SQL Server 2000 or 7.0 database? There is no 'recommended' collation as different collations will be used in different countries but as a guideline, installations in the United States and installations that require compatibility with SQL Server 7 databases should use the SQL_Latin1_General_Cp1_CI_AS collation. Non-United States installations in English speaking countries should use the Latin1_General_CI_AS collation
You don't want this error:
"120_ClientInvoice_ClientIDRequired.sql...Column 'dbo.Client.ClientID' is not of same collation as referencing column 'ClientInvoice.ClientID' in foreig..."
When you write a stored proc, it must work regardless of the user's collation. When you are joining to a temp table, you are joining two different databases (e.g. Northwind and tempdb), and they won't always have the same collation.
The reality is that you can't tell a user what collation to run their tempdb in - you can only specify the collation Northwind should be (and we don't even want to specify that - we want it to be their default, as per their server).
Here is what you need to do:
```sql
SELECT #ClientSummary.ClientID,
       DateOfLastReminder = MAX(ClientDiary.DateCreated),
       DaysSinceLastReminder = DATEDIFF(day, MAX(ClientDiary.DateCreated), GETDATE())
INTO #RecentReminderList
FROM ClientDiary
INNER JOIN #ClientSummary
    ON ClientDiary.ClientID = #ClientSummary.ClientID COLLATE database_default
WHERE ClientDiary.CategoryID LIKE 'DEBT-%'
GROUP BY #ClientSummary.ClientID
```
When trying to create a database in SQL Server 2005 from an existing create script written for SQL Server 2000, we came across a problem. Our create script was trying to determine the path to save the database file (the path to the default data store) by using the sysdevices table in the Master database; however, the schema for the Master database had changed in 2005 and our script could no longer find the column it relied on to determine this path.
Rather than creating a new script specific to 2005, we found that by removing the optional FILENAME attribute altogether, both SQL Server 2000 and 2005 were happy and the database files were saved into the default data directory which is what we were after.
The moral of the story is - keep it simple.
When using a create script to create a new database, let SQL Server determine the filename and path from its default settings. This will help make the script simpler, more flexible, and ready to use with utilities such as MS OSQL and SSW SQL Deploy.
```sql
DECLARE @device_directory NVARCHAR(520)
SELECT @device_directory = SUBSTRING(phyname, 1,
       CHARINDEX(N'master.mdf', LOWER(phyname)) - 1)
FROM master.dbo.sysdevices
WHERE (name = N'master')

EXECUTE (N'CREATE DATABASE [DatabaseName]
    ON PRIMARY ( NAME = N''[DatabaseName]'',
                 FILENAME = N''' + @device_directory + N'[DatabaseName].mdf'' )
    LOG ON ( NAME = N''[DatabaseName]_log'',
             FILENAME = N''' + @device_directory + N'[DatabaseName].ldf'' )
    COLLATE SQL_Latin1_General_CP1_CI_AS')
GO
```
Figure: Bad Example - FILENAME Parameter used to specify database path
```sql
CREATE DATABASE [DatabaseName] COLLATE SQL_Latin1_General_CP1_CI_AS
GO
```
Figure: Good Example - Generic CREATE DATABASE used
The SQL 2005 generated scripts are not compatible with SQL 2000, so use SQL 2000 to generate your scripts if you want to make your scripts work well on both versions.
```sql
IF EXISTS (SELECT * FROM sys.objects
           WHERE object_id = OBJECT_ID(N'[dbo].[ProcessTarget]')
             AND type IN (N'P', N'PC'))
    DROP PROCEDURE [dbo].[ProcessTarget]
```
**Figure: script only works on SQL 2005, because 'sys.objects' is only available in this version**
```sql
IF EXISTS (SELECT * FROM dbo.sysobjects
           WHERE id = OBJECT_ID(N'[dbo].[ProcessTarget]')
             AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE [dbo].[ProcessTarget]
```
**Figure: script works on both SQL 2000 and SQL 2005**
When you are tuning SQL statements, you tend to play in SQL Server Management Studio for a while. During this time, SQL Server caches your queries and execution plans.
That is all well and good, but when you are trying to speed up an existing query, you may not actually be making a difference even though your execution times are way down - the results are simply coming from cache.
You really need to clear SQL Server's cache (or buffer) every time you test the speed of a query. This prevents the cached data and/or execution plans from corrupting the next test.
To clear SQL Server's cache, run DBCC DROPCLEANBUFFERS , which clears all data from the cache. Then run DBCC FREEPROCCACHE , which clears the stored procedure cache.
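Putting this together (note that DBCC DROPCLEANBUFFERS only removes clean pages, so it is common to run CHECKPOINT first to flush dirty pages to disk):

```sql
-- Flush dirty pages so DROPCLEANBUFFERS can remove everything
CHECKPOINT;

-- Clear all data pages from the buffer cache
DBCC DROPCLEANBUFFERS;

-- Clear all cached execution plans
DBCC FREEPROCCACHE;
```

Only do this on a development or test server - on production it will hurt everyone's performance until the cache warms up again.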