DotNet Support Blog

Just like its predecessor, SQL Server 2008 is taking its sweet time to actually ship.  However, unlike its predecessor, it won't just be a "worthwhile upgrade".  It will kick ass.

Here are the top 10 reasons why.

10.  Plug-in model for SSMS.   SSMS 2005 also had a plug-in model, but it was not published, so the few developers that braved that environment were flying blind.  Apparently for 2008, the plug-in model will be published and a thousand add-ins will bloom. 

9.  Inline variable assignment.  I often wondered why, as a language, SQL languishes behind the times.  I mean, it has barely any modern syntactic sugar.  Well, in this version, they are at least scratching the surface.

Instead of:

DECLARE @myVar int
SET @myVar = 5

you can do it in one line:

DECLARE @myVar int = 5

Sweet.

8.  C-like math syntax.  SET @i += 5.  Enough said.  They finally let a C# developer on the SQL team.
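Combined with the inline assignment above, a quick sketch of what this buys you (the other compound operators work too):

```sql
DECLARE @i int = 10;   -- inline assignment, new in 2008
SET @i += 5;           -- @i = 15
SET @i -= 3;           -- @i = 12
SET @i *= 2;           -- @i = 24
SELECT @i;
```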

7.  Auditing.  It's a ten-dollar word for storing changes to your data for later review, debugging, or compliance with regulatory laws.  It's a thankless, mundane task, and no one is ever excited by the prospect of writing triggers to handle it.  SQL Server 2008 introduces automatic auditing, so we can now check one thing off our to-do list.
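The new audit objects look roughly like this (all names and the file path are placeholders; the server audit is created in master, the specification in the target database):

```sql
-- Define where audit records go, then switch the audit on.
CREATE SERVER AUDIT OrdersAudit
    TO FILE (FILEPATH = 'C:\SqlAudits\');
ALTER SERVER AUDIT OrdersAudit WITH (STATE = ON);

-- In the user database: record updates and deletes on one table.
CREATE DATABASE AUDIT SPECIFICATION OrdersAuditSpec
    FOR SERVER AUDIT OrdersAudit
    ADD (UPDATE, DELETE ON OBJECT::dbo.Orders BY public)
    WITH (STATE = ON);
```

No triggers, no hand-rolled history tables.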

6.  Compression.  You may think that this feature is a waste of time, but it's not what it sounds like.  The release will offer row-level and page-level compression.  The compression mostly takes place on the metadata.  For instance, page compression will store common data for affected rows in a single place. 

The metadata storage for variable-length fields is going to be completely crazy: they are pushing things into bits (instead of bytes).  For instance, the length of a varchar will be stored in 3 bits.

Anyway, I don't really care about space savings - storage is cheap.  What I do care about is that the feature promises (key word: "promises") to reduce I/O and RAM utilization at the cost of increased CPU utilization.  Every single performance problem I have ever dealt with had to do with I/O overloading.  We'll see how this plays out - I am skeptical until I see some real-world production benchmarks.
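Turning it on is a one-liner, and there is even a stored procedure to estimate the savings before you commit (dbo.Orders is a placeholder table name):

```sql
-- See what PAGE compression would save before rebuilding:
EXEC sp_estimate_data_compression_savings
    'dbo', 'Orders', NULL, NULL, 'PAGE';

-- Rebuild the table with page-level compression.
ALTER TABLE dbo.Orders REBUILD WITH (DATA_COMPRESSION = PAGE);
```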

5.  Filtered Indexes.  This is another feature that sounds great - we'll have to see how it plays out.  Anyway, it allows you to create an index that covers only the rows matching a predicate, leaving everything else out.  For example, index only the rows where Status IS NOT NULL.  Theoretically, it gets rid of all the dead weight in the index, allowing for smaller indexes and faster queries.
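That example would look something like this (table and column names are made up):

```sql
-- Rows with a NULL Status never enter the index at all.
CREATE NONCLUSTERED INDEX IX_Orders_Status
    ON dbo.Orders (Status)
    WHERE Status IS NOT NULL;
```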

4.  Resource governor.  All I can say is FINALLY.  Sybase has had it since version 12 (that's last millennium, people).  Basically, it allows the DBA to specify how many resources (e.g. CPU/RAM) each user is entitled to.  At the very least, it'll prevent people with sparse SQL knowledge from shooting off a query with a Cartesian product and bringing down the box.
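A hedged sketch of the moving parts - a pool, a group, and a classifier function that routes logins into the group (all names, limits, and the 'ReportUser' login are hypothetical):

```sql
-- Cap ad-hoc users at 20% CPU and memory.
CREATE RESOURCE POOL AdHocPool
    WITH (MAX_CPU_PERCENT = 20, MAX_MEMORY_PERCENT = 20);
CREATE WORKLOAD GROUP AdHocGroup USING AdHocPool;
GO
-- Classifier runs at login time and returns the workload group name.
CREATE FUNCTION dbo.fn_Classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    IF SUSER_SNAME() = 'ReportUser'
        RETURN N'AdHocGroup';
    RETURN N'default';
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_Classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```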

Actually Sybase is still ahead of MS on this feature.  Its ASE server allows you to prioritize one user over another - a feature that I found immensely useful.

3.  Plan freezing.  This is a solution to my personal pet peeve. Sometimes SQL Server decides to change its plan on you (in response to data changes, etc...).  If you've achieved your optimal query plan, now you can stick with it.  Yeah, I know, hints are evil, but there are situations when you want to take a hammer to SQL Server - well, this is the chill pill.
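Freezing is exposed through plan guides. One way to pin the plan that is currently in cache is sp_create_plan_guide_from_handle - a sketch, assuming the query text filter and guide name below are placeholders:

```sql
DECLARE @plan_handle varbinary(64), @offset int;

-- Find the cached plan for the statement we want to freeze.
SELECT @plan_handle = qs.plan_handle,
       @offset      = qs.statement_start_offset
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE st.text LIKE '%MyProblemQuery%';

-- Pin it: the optimizer will keep using this plan from now on.
EXEC sp_create_plan_guide_from_handle
    @name = N'FrozenPlanGuide',
    @plan_handle = @plan_handle,
    @statement_start_offset = @offset;
```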

2.  Processing of delimited strings.   This is awesome and I could have used this feature...well, always.  Currently, we pass in delimited strings in the following manner:

exec sp_MySproc 'murphy,35;galen,31;samuels,27;colton,42'

Then the stored proc needs to parse the string into a usable form - a mindless task.

In 2008, Microsoft introduced Table-Valued Parameters (TVPs).

CREATE TYPE PeepsType AS TABLE (Name varchar(20), Age int)
DECLARE @myPeeps PeepsType
INSERT @myPeeps SELECT 'murphy', 35
INSERT @myPeeps SELECT 'galen', 31
INSERT @myPeeps SELECT 'samuels', 27
INSERT @myPeeps SELECT 'colton', 42

exec sp_MySproc2 @myPeeps

And the sproc would look like this:

CREATE PROCEDURE sp_MySproc2(@myPeeps PeepsType READONLY) ...

The advantage here is that you can treat the Table Type as a regular table, use it in joins, etc.  Say goodbye to all those string parsing routines.
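For illustration, the body of the sproc might join against the parameter like any other table (the Employees table and its columns are made up):

```sql
CREATE PROCEDURE sp_MySproc2 (@myPeeps PeepsType READONLY)
AS
BEGIN
    -- The TVP behaves like a read-only table variable.
    SELECT e.*
    FROM Employees e
    JOIN @myPeeps p
      ON e.LastName = p.Name AND e.Age = p.Age;
END
```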

1. Intellisense in SQL Server Management Studio (SSMS).  This was previously possible in SQL Server 2000 and 2005 only through third-party add-ins like SQL Prompt ($195).  But those tools are a horrible hack at best (e.g. they hook into the editor window and try to interpret what the application is doing).

Built-in intellisense is huge - it means new people can easily learn the database schema as they go.

There are a ton of other great features - most of them small, but hugely useful.  There is a lot of polishing all over the place, like server resource monitoring right in SSMS, a la Vista. 

I'd love to finish this entry on a happy note, but I can't, because I just finished perusing the Editions of SQL Server 2008 page.  In addition to the Standard, Enterprise, Developer and Express editions, there are now Workgroup, Web, Compact (which has nothing to do with SQL Server) and Express Advanced editions.  Here is the comparison matrix.  And you thought picking a version of Vista was complicated.

 

You have only $50 left and you can buy two DVDs or one SQL book. What do you do? I would buy the book, but not every person has the same idea of a fun time. That is why I present you with a bunch of links to articles that will give you very good info; some of this you won't be able to find in a book anyway.

The curse and blessings of dynamic SQL. How you use dynamic SQL, when you should - and when you should not.

Arrays and Lists in SQL Server. Several methods on how to pass an array of values from a client to SQL Server, and performance data about the methods. Two versions are available, one for SQL 2005 and one for SQL 2000 and earlier.

Implementing Error Handling with Stored Procedures and Error Handling in SQL Server – a Background. Two articles on error handling in SQL Server.

The ultimate guide to the datetime datatypes
The purpose of this article is to explain how the datetime datatypes work in SQL Server, including common pitfalls and general recommendations.

Stored procedure recompiles and SET options
Using stored procedures is generally considered a good thing. One advantage of stored procedures is that they are precompiled. This means that at execution time, SQL Server will fetch the precompiled procedure plan from the plan cache (if it exists) and execute it. This is generally faster than optimizing and compiling the code for each execution. However, under some circumstances, a procedure needs to be recompiled during execution.
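You can also force recompilation yourself when a cached plan has gone stale. A minimal sketch (dbo.MyProc and the Orders table are placeholders):

```sql
-- Flag the procedure so its plan is rebuilt on the next execution:
EXEC sp_recompile N'dbo.MyProc';

-- Or opt a procedure out of plan caching entirely:
CREATE PROCEDURE dbo.MyProc
WITH RECOMPILE
AS
SELECT COUNT(*) FROM Orders;
```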

Do You Know How Between Works With Dates?
Article explaining why it can be dangerous to use BETWEEN with datetime data types.
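The short version: BETWEEN is inclusive on both ends, so with datetime values it includes only the exact midnight of the end date and silently drops the rest of that day. A quick illustration (table name is made up):

```sql
-- Misses every order after midnight on Jan 31:
SELECT * FROM Orders
WHERE OrderDate BETWEEN '20080101' AND '20080131';

-- Safer: a half-open range that covers all of January.
SELECT * FROM Orders
WHERE OrderDate >= '20080101' AND OrderDate < '20080201';
```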

How Are Dates Stored Internally In SQL Server?
Article explaining how datetimes are actually stored internally

Three-part deadlock troubleshooting post, a must read if you want to understand how to resolve deadlocks.
Deadlock Troubleshooting, Part 1
Deadlock Troubleshooting, Part 2
Deadlock Troubleshooting, Part 3

SQL Server 2005 Whitepapers List
A list of 29 different SQL Server 2005 Whitepapers

Keep a check on your IDENTITY columns in SQL Server
This article shows you how to keep an eye on your IDENTITY columns and find out before they run out of values and fail with an arithmetic overflow error.

Character replacements in T-SQL
Quite often SQL programmers are left with the dirty job of working with badly formatted strings, mostly generated from external sources. Typical examples are badly structured date values, social security numbers with misplaced hyphens, badly formatted phone numbers, etc. When the data set is small, in many cases one can easily fix it with a one-time cleanup code snippet, but for larger sets one will need more generalized routines.
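The classic quick fix is nested REPLACE calls - for example, stripping the punctuation out of a phone number:

```sql
-- Peels off '(', ')', '-' and spaces, inside out:
SELECT REPLACE(REPLACE(REPLACE(REPLACE(
       '(555) 123-4567', '(', ''), ')', ''), '-', ''), ' ', '');
-- yields 5551234567
```

For anything beyond a handful of characters, a generalized cleanup function like the ones in the article is the better bet.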

 

SQL Injection and how to avoid it

By Sothy Mom

It isn't as big of a deal at the moment, but it is always good to make sure everyone is aware of it and how dangerous it can be.  There is some very good information on it on MSDN here.  The important part is to remember that anytime you take input from an external source (someone typing on a web page), they won't always put in what you expect.

The safest way to protect yourself from SQL injection is to always pass user input through parameterized stored procedures.  It is really simple to do this.  For example, this is how you don't want to code things:

var shipCity = Request.form("ShipCity");
var sql = "select * from OrdersTable where ShipCity = '" + shipCity + "'";

This allows someone to use SQL Injection to gain access to your database.  For example, imagine if someone put in the following for the "ShipCity":

Redmond'; drop table OrdersTable--

This would drop the entire table!  If you have seen much on SQL Injection, you know attackers have figured out all kinds of ways to get information about your database or server, so don't think they can't find the names of tables, etc.

The correct way to do this would be using a stored procedure as follows:

SqlDataAdapter myCommand = new SqlDataAdapter("AuthorLogin", conn);
myCommand.SelectCommand.CommandType = CommandType.StoredProcedure;
SqlParameter parm = myCommand.SelectCommand.Parameters.Add("@au_id",
     SqlDbType.VarChar, 11);
parm.Value = Login.Text;

Then you will be protected.  Be sure to use parameterized stored procedures to keep the stored procedure itself from having the same problem as before.

There are other hints and advice on the MSDN article that you can check out, but this is the major piece of advice to know.
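The same principle applies inside T-SQL itself: if you must build SQL dynamically, parameterize it with sp_executesql instead of concatenating strings. A minimal sketch (table and column names match the earlier example):

```sql
DECLARE @city varchar(50);
SET @city = 'Redmond';  -- value straight from user input

-- @city is passed as a parameter, never spliced into the SQL text.
EXEC sp_executesql
    N'SELECT * FROM OrdersTable WHERE ShipCity = @city',
    N'@city varchar(50)',
    @city = @city;
```

Even a payload like "Redmond'; drop table OrdersTable--" is just a harmless string value here.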

There is also some additional information here, along with a video at Explained – SQL Injection and another video here.  There are tons of links on the web, so feel free to research this more to be sure you are safe from this problem.

 

Do you use database projects in Visual Studio? If not, then now is a good time to start. It is the best way I have found to source-control my databases without actually sticking the database file itself in the repository. It is very simple to add to your solution and creates a default directory structure for your create scripts, change scripts and queries. Best of all, the project will be recognized by source control and added to the source code repository. This means you can keep versioned scripts for maintainability. It also gives you the option to do the dreaded rollback!

Let's start by creating a database project of our own. To start, we will create a blank Visual Studio solution.

[screenshot-2008-05-05_18_56_00]

Now click File -> Add -> New Project.

[screenshot-2008-05-05_18_57_08]

You should get the dialog below. Expand the "Other Project Types" node, click "Database", then select "Database Project". This gives us the opportunity to name our database project. We will name this one after the database we are going to be using, good ol' Northwind.

[screenshot-2008-05-05_18_58_44]

Next you will be asked to set up a database reference. So here all we have to do is add a new reference to the Northwind database.

[screenshot-2008-05-05_18_59_29]

My database will be in SQL Server, yours can be in any of the choices shown below.

[screenshot-2008-05-05_19_01_31]

Now we just set the server name and choose the Northwind database.

[screenshot-2008-05-05_19_02_07]

You should then test your connection (out of good practice) and see the dialog below upon success.

[screenshot-2008-05-05_19_02_15]

Now we will see our database reference shown in the original list of references. Just double click on your newly added reference.

[screenshot-2008-05-05_19_02_31]

Now your project should look something like the shot below.

[screenshot-2008-05-05_19_02_51]

Once we get this far, we need to start filling our folders with database scripts. You can add new folders if you wish, but I think the default folder structure is good enough. Now then, we can use the database publishing wizard to get our first create script into the "Create Scripts" folder.

Double click the reference to your database in the database references and it should bounce you to the server explorer window with your data connection all setup.

[screenshot-2008-05-05_19_05_20]

Right click your data connection and click "Publish to provider..." to start the database publishing wizard.

[screenshot-2008-05-05_19_05_31]

Now we select the database we would like to publish, so again I will choose Northwind.

[screenshot-2008-05-05_19_05_52]

Now with the "Script to file" option chosen, we will browse to the "Create Scripts" folder in our database project to put the outputted .sql file.

[screenshot-2008-05-05_19_06_51]

Now we get to choose some options. I am going to set the output for SQL Server 2005 and select to script the database schema only.

[screenshot-2008-05-05_19_07_18]

Now we can review the summary of what we have configured and click finish.

[screenshot-2008-05-05_19_07_26]

Upon clicking finish, we will see the progress of the .sql file being scripted to the hard drive.

[screenshot-2008-05-05_19_08_07]

OK, for some reason I couldn't find a refresh button in the database project to see the file that was created, so we will right-click the "Create Scripts" folder and click "Add Existing Item...".

[screenshot-2008-05-05_19_10_15]

Then browse to our .sql file and click the "Add" button.

[screenshot-2008-05-05_19_10_27]

Now we should see it in our project and we can also see the contents of the generated script by opening it in the editor.

[screenshot-2008-05-05_19_12_20]

[screenshot-2008-05-05_19_12_31]

It really is just that simple. If nothing else, you have a designated place to put all your SQL scripts for your database. The big plus is that your scripts will be source controlled. Also, anyone who opens the solution will know exactly which databases are used by the project.