Here’s our second blog from one of our DSP techies…
Two months have passed and the SQL Server Security Auditing project nears completion, with just a few workshop-style educational sessions and documentation left to finish – the perfect time to write Part II of the SQL Server Auditing blog.
To recap, we were cornered – SQL Audit, Policy Based Management, C2 Auditing, Extended Events … All the bells and whistles, all the good stuff us SQL Server DBAs know and love (well, the majority of the time) could not meet the demands of our client’s sprawled estate. The challenge was to provide an auditing solution for all versions and all editions, and meet the demands of the project: three months, no cash, no existing security – oh, and can you do it with Oracle?
A security framework document was written to define new security standards and, ultimately, the project itself. Laid out in exactly the same way as the Oracle section of that document, everything was built on and around internal policies, existing build standards, industry standards, and external audit requirements. These all fell into the following categories:
Accounts: Built-in \ Privileged & Service
Some obvious, some default, some enforced, the framework covered all bases: from SQL Server log file number and retention, to detection of blank and weak passwords. Disabling TELNET, FTP and NNTP services, binary lockdown, ports, patching, service account interactive login rights, default database context … More of the good stuff us DBAs know and love – the better part of one hundred different security controls had to be defined.
It was a start. Code hardening, encryption, certificates, penetration testing and many other good security practices simply did not fit into the project time frame and indeed, are individual projects in their own right. But we had the 101 of SQL Server Security covered. Time to measure up.
A linked server infrastructure had already been established from a DBA ‘admin’ instance, used for health check reporting by the BAU team, which relied upon a SQL Server inventory corresponding to the linked servers. No time to re-invent the wheel, both the inventory and the linked servers could be used for auditing. There were reservations about linked servers from a security perspective, but they were there to stay.
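For context, registering an inventoried instance as a linked server with the default self-mapping looks something like the following. This is a minimal sketch – the server name is illustrative, not the client’s actual configuration:

```sql
-- Register a remote instance as a linked server on the DBA 'admin' instance
-- (hypothetical instance name; run once per inventoried server).
EXEC sp_addlinkedserver
     @server     = N'PROD01\SQL2008',
     @srvproduct = N'SQL Server';   -- with this product name, no provider/datasrc is needed

-- Default self-mapping: distributed queries connect using the caller's own credentials.
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'PROD01\SQL2008',
     @useself    = N'TRUE';
```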
And so one stored procedure was written per audit control, with the controls divided into control checks and control reports. The control checks were designed to indicate a pass or a fail, e.g. has your sa account been renamed, yes or no? The control reports fed back data, e.g. which logins had blank passwords, or who has local admin rights. All information was collected into defined data collection tables for later analysis.
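As an illustration (not the client’s actual code), a control check along the lines of “has sa been renamed?” might be sketched like this, writing a pass/fail row to a hypothetical collection table:

```sql
-- Hypothetical control check: has the sa login been renamed?
-- sa always has SID 0x01, whatever it is currently called.
DECLARE @Pass bit;

SELECT @Pass = CASE WHEN name <> 'sa' THEN 1 ELSE 0 END
FROM   sys.server_principals
WHERE  sid = 0x01;

INSERT INTO dbo.AuditCheckResults (InstanceName, ControlName, Pass, CollectedAt)
VALUES (@@SERVERNAME, 'SA_RENAMED', @Pass, GETDATE());
```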
Nearly all stored procedures work on the same template, which can be used again and again for additional controls added to the framework at a later date. They all build dynamic SQL to execute distributed queries to pull the audit data back to data collection tables, wrapped around a TRY CATCH block for error handling – all errors caught with details (date, sproc, instance etc.), written out to an error handling table.
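The template described above could be sketched like so – all names (procedure, tables, columns) are illustrative assumptions, but the shape is the same: dynamic SQL building a distributed query, wrapped in TRY … CATCH, with errors landing in a central error table:

```sql
-- Skeleton of the per-control template (illustrative names throughout).
CREATE PROCEDURE dbo.usp_Check_SaRenamed
    @LinkedServer sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);

    BEGIN TRY
        -- Build a distributed query against the linked server and
        -- land the result in a local data collection table.
        SET @sql = N'INSERT INTO dbo.AuditCheckResults (InstanceName, ControlName, Pass, CollectedAt)
                     SELECT ''' + @LinkedServer + N''', ''SA_RENAMED'',
                            CASE WHEN name <> ''sa'' THEN 1 ELSE 0 END, GETDATE()
                     FROM [' + @LinkedServer + N'].master.sys.server_principals
                     WHERE sid = 0x01;';
        EXEC sys.sp_executesql @sql;
    END TRY
    BEGIN CATCH
        -- All errors caught with details and written to an error handling table.
        INSERT INTO dbo.AuditErrorLog (ErrorDate, ProcName, InstanceName, ErrorNumber, ErrorMessage)
        VALUES (GETDATE(), OBJECT_NAME(@@PROCID), @LinkedServer, ERROR_NUMBER(), ERROR_MESSAGE());
    END CATCH
END;
```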
The design was to have a single stored procedure to cycle through the instance inventory table and then the check controls (also table entries) to make nested calls to categories of stored procedures (by control type) at different levels, matching the framework document categories – with so many controls, there was a clear need to divide everything up just to manage it. This also leads the pathway to test individual control items later or run by group of control items per instance or group of instances; no need to run an entire audit every time.
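The driver loop might be sketched as below – again an assumption-laden illustration, not the actual code: one cursor over the instance inventory, a nested cursor over the enabled controls, and each control procedure executed by name:

```sql
-- Illustrative driver: for each active instance, run each enabled control.
DECLARE @Instance sysname, @ProcName sysname;

DECLARE instance_cur CURSOR FOR
    SELECT InstanceName FROM dbo.InstanceInventory WHERE IsActive = 1;
OPEN instance_cur;
FETCH NEXT FROM instance_cur INTO @Instance;
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE control_cur CURSOR FOR
        SELECT ProcName FROM dbo.AuditControls
        WHERE IsEnabled = 1
        ORDER BY ControlCategory, ControlId;   -- grouped to match the framework categories
    OPEN control_cur;
    FETCH NEXT FROM control_cur INTO @ProcName;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC @ProcName @LinkedServer = @Instance;   -- each control sproc takes the target instance
        FETCH NEXT FROM control_cur INTO @ProcName;
    END
    CLOSE control_cur; DEALLOCATE control_cur;
    FETCH NEXT FROM instance_cur INTO @Instance;
END
CLOSE instance_cur; DEALLOCATE instance_cur;
```

Filtering the control table (by category, or by a single control) is what makes it possible to run a subset rather than the whole audit.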
Distributed queries over linked servers? Security alarm bells should be ringing, and they were – the initial reservations remained, of course. But the information had to be pulled remotely, and the linked servers were established infrastructure on the client site, there to stay. Needless to say, the linked servers were kept at the default self-mapping, so distributed queries ran under the current login’s security context; ‘Encrypting SQL Server Connections’ (SSL, certificates, IPSec …) was added to the task list for the SQL Server 2012 roadmap – fortunately the next project, as luck would have it.
Reporting and remediation then, a different kind of R&R. We needed to be able to report on our audit information and also to fix stuff that was not adhering to the all-important new security framework.
Since remediation can break so much, we are always in danger of fixing security only to break something else, particularly our apps – even down to renaming our sa account. How many times have we all seen this in use? (Hands up, I’m guilty too.) Yes, it should be disabled and renamed, but quite often it is not. …Just one small example.
Automating security fixes is dangerous, so instead a build script was written to ensure all new builds are compliant. We already had control tables for the check \ report audit controls, the SQL Server inventory table, and the audit data collection tables – join them all together into a nice audit failure report in SSRS with a single query. Our new auditing solution is now in place as a mechanism to provide a snapshot of the estate, comparable against our framework document and against any new build.
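The single query behind that SSRS report might look something like this – table and column names are hypothetical, but it is just the three sets of tables joined with a filter on failures:

```sql
-- Hypothetical audit failure report: every control that failed, per instance.
SELECT  i.InstanceName,
        c.ControlCategory,
        c.ControlName,
        r.CollectedAt
FROM    dbo.AuditCheckResults AS r
JOIN    dbo.InstanceInventory AS i ON i.InstanceName = r.InstanceName
JOIN    dbo.AuditControls     AS c ON c.ControlName  = r.ControlName
WHERE   r.Pass = 0
ORDER BY i.InstanceName, c.ControlCategory, c.ControlName;
```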
Theory would have it the estate would be compliant over time if new builds are compliant going forward, with less emphasis on looking over one’s shoulder. That said, an impact assessment was implemented. Simply LOW, MED, HIGH, for all security controls recommended to be enforced. Anything that was subject to pain would be registered as a security exception for future review \ replacement by new builds. The emphasis was on recommendation to business users rather than enforcement.
The result is a single procedure that runs an entire audit of the SQL Server estate, schedulable by Windows task or SQL Agent job. It’s scalable and portable, and can be free if set up on SQL Express. With the reports we have something tangible to look at and review against a defined security framework – a solid foundation to work from and build on.
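On SQL Express, which has no Agent, Windows Task Scheduler fills the gap; the scheduled action can be as simple as a sqlcmd call (instance, database and procedure names here are illustrative):

```bat
:: Hypothetical nightly task action: run the full audit from the DBA 'admin' instance.
sqlcmd -S ADMIN01\SQLEXPRESS -E -d DBAAdmin -Q "EXEC dbo.usp_RunFullAudit;"
```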
For more information or to schedule a demo, please contact us.