Saturday, December 30, 2017

Happy new year 2018 to all!

From SQL Guatemala, we wish you a happy new year 2018.

 We hope this new year brings you joy and zero production outages, corrupted databases, corrupted backups, full disks, and failed nodes hehehehe!

Also, we wish for you all your desired certifications, the new software licenses and tools you need, no dumb users with sa privileges, and of course... a salary raise!

Best wishes for all...

Tuesday, December 12, 2017

SQL Vulnerability Assessment

A very interesting feature was recently added to SQL Server Management Studio 17.4 (SSMS): The new SQL Vulnerability Assessment.

What is it?

SQL Vulnerability Assessment (VA) is a lightweight, easy-to-use tool that identifies and helps remediate potential security vulnerabilities, using rules based on Microsoft's best practices.
The report executes at the database level.

VA is included in SSMS 17.4 and works with SQL Server 2012 and later.

Using VA

  1. From Management Studio, connect to the SQL Server database instance.

  2. Right click on the database to analyze and select Tasks > Vulnerability Assessment > Scan for vulnerabilities...

  3. Select the path to store the report and click OK.

  4. After a moment, the report is shown.
    You can now see all the vulnerabilities found, with a brief explanation and a suggested fix for each one:
    Failed rules

    Passed rules



You can also approve results to be marked as passed using the Approve as Baseline option, so the next time you run a scan they are marked as passed:

Just select the rule to mark as approved:

 A warning will now pop up:

 Next time you run a scan, the rule is marked as passed:


Friday, November 24, 2017

How to check advanced options in SQL Server?

To check the advanced options configured at the instance level in SQL Server, you need to use the sp_configure system stored procedure.

The default behavior, for security and stability purposes, is to show only the basic (non-advanced) parameters, so if you execute the stored procedure, it will show something like this:

EXEC sp_configure

It doesn't show all the configuration options

To display all of the available options, you have to run the "show advanced options" command:

EXEC sp_configure 'show advanced options', '1'; 

CAUTION: Be extremely careful when enabling this option, and remember to disable it right after using it: any user with access to sp_configure will then be able to see the advanced configuration parameters, exposing them to untrusted users.

After this, you have to execute a RECONFIGURE command for the changes to take effect.

Note: RECONFIGURE and RECONFIGURE WITH OVERRIDE have the same effect for displaying the parameters; the only difference is when you change a parameter, the OVERRIDE option allows you to set a value outside the allowed boundaries (yes, you should never do it, but the engine gives us the option of shooting ourselves in the foot). 

To show all the advanced options, run the following commands:

USE master;

EXEC sp_configure 'show advanced options', '1'; --Enable advanced options
RECONFIGURE;

EXEC sp_configure; --Show all the options

EXEC sp_configure 'show advanced options', '0'; --Always disable advanced options when done
RECONFIGURE;

Note that not all options are displayed on the image
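Once the advanced options are visible, you can change one of them with the same procedure. Here is a minimal sketch; the option and value chosen below are only an illustration, not a recommendation for your server:

```sql
-- Enable advanced options so the parameter becomes configurable
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Illustrative example: change the 'cost threshold for parallelism' value
EXEC sp_configure 'cost threshold for parallelism', 50;
RECONFIGURE;

-- Hide advanced options again when you are done
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;
```

Remember that every sp_configure change needs its RECONFIGURE to take effect.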


Monday, November 20, 2017

Finally I am MCSA

I have recently obtained the certification Microsoft Certified Solutions Associate: SQL Server 2012/2014.
I began this path almost 2 years ago. Back then I wrongly thought the MCP would be enough for me, but then I decided to get out of my comfort zone and push on to the next step, and here we are: after a long struggle and a lot of blood and tears, I made it :)

Perhaps you are wondering, "hey, this is not a big deal, so why the post?" Because it was, at least for me, so freaking difficult! Let me explain why:

The old format had 3 required exams

When I started my MCSA certification path, the SQL Server 2016 certification was still very new (as were my skills in that area), so I decided to take the 2012/2014 path instead, which has 3 required exams:
  1. Querying Microsoft SQL Server 2012/2014
  2. Administering Microsoft SQL Server 2012/2014 Databases
  3. Implementing a Data Warehouse with Microsoft SQL Server 2012/2014
And now for 2016, it only requires 2.

Querying, BI and Administration was all in the same pack

For MCSA 2016, you can choose which one to obtain: database administration, database development, or BI development. For 2012 (as you can see above), everything is included in the same pack.

One of the hardest tests of my life

Among all the tests I have taken in my life (school, university, certifications), I consider Querying Microsoft SQL Server one of the most difficult (if not the most difficult). I have known people who passed this test easily, but I'm not used to memorizing things, and for this test, *SPOILER ALERT* you have to memorize a lot!

My next step

In the near future, I want to focus on 3 things:
  • Upgrading to MCSA 2016 (since the upgrade only requires 1 test)
  • Obtaining the MCSE: Data Management and Analytics certification (focused on DB Administration)
  • Learning about Machine Learning and maybe, in the future, obtaining the MCSA: Machine Learning certification.

Friday, November 10, 2017

Query to determine mirrored database status

I am sharing today a SQL script to quickly check which databases on an instance are mirrored and their status.

select db.name as database_name, --pick the mirroring columns you need; these are common ones
       dm.mirroring_role_desc,
       dm.mirroring_state_desc,
       dm.mirroring_safety_level_desc,
       dm.mirroring_partner_instance
from sys.databases db
inner join sys.database_mirroring dm
on db.database_id = dm.database_id
where dm.mirroring_role_desc is not null
order by db.name;
I hope you find this script useful! 
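A possible variant of the same query (assuming the same catalog views) is to list only the mirrored databases that are not synchronized, to spot problems quickly:

```sql
-- List mirrored databases whose state is anything other than SYNCHRONIZED
SELECT db.name AS database_name,
       dm.mirroring_role_desc,
       dm.mirroring_state_desc
FROM sys.databases db
INNER JOIN sys.database_mirroring dm
    ON db.database_id = dm.database_id
WHERE dm.mirroring_role_desc IS NOT NULL
  AND dm.mirroring_state_desc <> 'SYNCHRONIZED';
```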

Monday, November 6, 2017

SSIS Basics: Integration Services Catalogs

In the new Project deployment model for SQL Server Integration Services, a new SQL Server database is used to provide central management and deployment of packages: the SSIS catalog.

What does it do?

It is a special database called SSISDB where you store projects, the packages related to those projects, environments (for example Dev, Test, QA, and Prod), and variables related to each environment (for example, a connection string). It also stores execution history for further analysis.

How to access it

First of all, you need to install it since it is not added by default.

If you don't have it installed


Open SQL Server Management Studio, locate the Integration Services Catalogs folder, right-click it, and select the Create Catalog option.

After that, a new window will appear, just enter the password you want, and the catalog will be created.

If you have it already installed


Just expand the folder and the SSISDB will be available:

An SSISDB database is also created; it can be used for custom queries or monitoring.
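For example, here is a sketch of a monitoring query against the catalog.executions view (the column choice is just one possibility):

```sql
-- Last 20 package executions recorded in the SSIS catalog
SELECT TOP (20)
       e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       e.status,      -- 4 = failed, 7 = succeeded
       e.start_time,
       e.end_time
FROM SSISDB.catalog.executions AS e
ORDER BY e.start_time DESC;
```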

How to configure it

Now we can deploy packages and create environments for each project.

To deploy a test package


You can use the SSDT IDE to deploy a package directly to the catalog if you have it configured with the Project deployment model.

Right-click the project and select the Deploy option.

Select your SQL Server instance where the SSISDB catalog is installed. In the Path field, select the folder where you want to install the package.

Finish the wizard and if everything is OK, the project will be deployed to the catalog successfully.
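If you prefer T-SQL over the wizard, the catalog also exposes stored procedures for deployment. A sketch follows, where the folder name and the .ispac file path are hypothetical examples:

```sql
-- Read the project deployment file (.ispac) as a binary stream
DECLARE @ProjectBinary varbinary(max) =
    (SELECT BulkColumn
     FROM OPENROWSET(BULK N'C:\Deploy\TestProject.ispac', SINGLE_BLOB) AS ispac);

-- Create a folder in the catalog (skip this if it already exists)
EXEC SSISDB.catalog.create_folder @folder_name = N'TestFolder';

-- Deploy the project into that folder
EXEC SSISDB.catalog.deploy_project
     @folder_name    = N'TestFolder',
     @project_name   = N'TestProject',
     @project_stream = @ProjectBinary;
```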

 We can expand the catalog again and see that the test package is now available.


Create environments


Another feature of the SSISDB catalog is environments, where you can store different parameters and configurations for the package, so you can select which one to use when executing it.

As you can see, for this particular example we have created an environment to store different connection strings.
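Environments can also be created from T-SQL with the catalog stored procedures. Here is a sketch under assumed names (the folder, environment, and connection string are all hypothetical):

```sql
-- Create a 'Dev' environment inside an existing catalog folder
EXEC SSISDB.catalog.create_environment
     @folder_name      = N'TestFolder',
     @environment_name = N'Dev';

-- Add a variable to it, for example a connection string
EXEC SSISDB.catalog.create_environment_variable
     @folder_name      = N'TestFolder',
     @environment_name = N'Dev',
     @variable_name    = N'ConnString',
     @data_type        = N'String',
     @sensitive        = 0,
     @value            = N'Data Source=DEVSRV;Initial Catalog=Stage;Integrated Security=SSPI;',
     @description      = N'Dev connection string';
```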

So now you have the basic knowledge to stop maintaining your packages in the filesystem, since you have a centralized place to store and manage them with improved security.


Friday, October 6, 2017

T-SQL Basics: MERGE Statement

When you need to add data to an existing table from an external source or database and you are not sure whether some of the data already exists, one useful statement is MERGE.

This statement can significantly reduce the code needed to validate and insert data from source to destination, or to update it if it already exists; you can even choose to delete data instead.
The general form is:

MERGE INTO  <target_table> 
USING <table_source>   
    ON <merge_search_condition>  
    [ WHEN MATCHED [ AND <clause_search_condition> ]  
        THEN <merge_matched> ] [ ...n ]  
    [ WHEN NOT MATCHED [ BY TARGET ] [ AND <clause_search_condition> ]  
        THEN <merge_not_matched> ]  
    [ WHEN NOT MATCHED BY SOURCE [ AND <clause_search_condition> ]  
        THEN <merge_matched> ] [ ...n ];

Let us look at an example of use. Using the AdventureWorks database, I have created a test table with fewer records than the original table, using a simple SELECT INTO:

SELECT *
INTO [AdventureWorks2014].[Sales].[Currencytest]
FROM [AdventureWorks2014].[Sales].[Currency]
WHERE Name LIKE '%Dollar%';
We can see the difference in rows:

Our test target table

Also, let us change the modified date, just to show you how the update works:

UPDATE [AdventureWorks2014].[Sales].[Currencytest]
SET ModifiedDate = 'jan 01 2017' 
We can see the difference on records and dates:

In our test table, only records with the word 'Dollar' exists

Now, using the MERGE sentence, we update existing records with one date, and add the missing records with other date:

MERGE INTO [AdventureWorks2014].[Sales].[Currencytest] ct -- our test table
USING [AdventureWorks2014].[Sales].[Currency] c --source table
ON ct.CurrencyCode = c.CurrencyCode
WHEN MATCHED THEN UPDATE SET
    ct.ModifiedDate = 'Oct 31, 2017' --the update date is Halloween
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CurrencyCode, Name, ModifiedDate)
    VALUES(c.CurrencyCode, c.Name, 'Dec 25 2017') --insert date is Christmas :)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE; --if a row exists in the destination but not in the source, delete it

Now take a look at the row count: both tables match:

same number of records :)

And the data:

Sometimes it is Christmas, sometimes it is Halloween

Please note that the ON clause is used to determine whether a row matches or not; in most cases you will use your table's primary key to perform the matching.

You can use WHEN MATCHED twice, as long as the first clause includes an AND condition (and one clause performs an UPDATE while the other performs a DELETE).
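As an illustration, here is a sketch with two WHEN MATCHED clauses; the tables and columns below are hypothetical:

```sql
MERGE INTO dbo.Products AS t
USING dbo.ProductUpdates AS s
    ON t.ProductID = s.ProductID
WHEN MATCHED AND s.Discontinued = 1
    THEN DELETE                           -- remove discontinued products
WHEN MATCHED
    THEN UPDATE SET t.Price = s.Price     -- otherwise refresh the price
WHEN NOT MATCHED BY TARGET
    THEN INSERT (ProductID, Price)
         VALUES (s.ProductID, s.Price);
```

The first WHEN MATCHED must carry the extra AND condition; rows that match but do not satisfy it fall through to the second clause.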