Monday, April 28, 2008
Bind data with a DataSet
SqlDataAdapter da = new SqlDataAdapter("select * from orders order by " + str, con);
DataSet ds = new DataSet();
da.Fill(ds, "orders");
DataGrid1.DataSource = ds.Tables["orders"];
DataGrid1.DataBind();
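One caution about the snippet above: the sort column str is concatenated straight into the SQL text, and an ORDER BY column cannot be passed as a SqlParameter. A defensive option is to whitelist the column name before concatenating it. A minimal sketch, assuming a hypothetical list of known columns:

```
// Hypothetical whitelist of columns we know exist in the orders table.
private static readonly string[] AllowedSortColumns =
    { "OrderID", "OrderDate", "CustomerID" };

string BuildOrdersQuery(string sortColumn)
{
    // Reject anything that is not a known column name;
    // this blocks injected SQL arriving through sortColumn.
    if (Array.IndexOf(AllowedSortColumns, sortColumn) < 0)
        sortColumn = "OrderID"; // fall back to a safe default
    return "select * from orders order by " + sortColumn;
}
```

The whitelist approach works because the untrusted input is never copied into the SQL text; only one of our own known-good strings is.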
Bind data with a DataReader
{
SqlConnection myConnection;
SqlCommand myCommand;
SqlDataReader myDataReader;
myConnection = new SqlConnection(ConfigurationSettings.AppSettings["strConn"]);
myConnection.Open();
//prepare sql statements
myCommand = new SqlCommand("SELECT TOP 10 * FROM EMPLOYEE", myConnection);
myDataReader = myCommand.ExecuteReader();
while (myDataReader.Read())
{
Response.Write(myDataReader["fname"]);
//Spacing
Response.Write(" ");
Response.Write(myDataReader["minit"]);
//Spacing
Response.Write(" ");
Response.Write(myDataReader["lname"]);
//New Line
Response.Write("<br />");
}
//cleanup objects
myDataReader.Close();
myConnection.Close();
}
DATABASE Tuning
1) Connection Pooling and Object Pooling
2) Use SqlDataReader instead of DataSet wherever possible
3) Keep your DataSets lean
4) Avoid inefficient queries
5) Avoid unnecessary round trips
6) Avoid too many open connections
7) Avoid transaction misuse
8) Avoid over-normalized tables
9) Reduce serialization
10) Do not use CommandBuilder at run time
11) Use stored procedures whenever possible
12) Avoid auto-generated commands
13) Use SequentialAccess as often as possible
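The SqlDataReader and SequentialAccess tips above combine naturally: a reader opened with CommandBehavior.SequentialAccess streams large columns in chunks instead of buffering whole rows in memory. A sketch, where the table and column names are assumptions:

```
// Stream a large varbinary column with SequentialAccess
// instead of loading the whole row into memory at once.
// Requires: using System.Data; using System.Data.SqlClient;
using (SqlConnection con = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand(
    "SELECT DocName, DocData FROM Documents", con))
{
    con.Open();
    using (SqlDataReader rdr =
        cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (rdr.Read())
        {
            // With SequentialAccess, columns must be read in ordinal order.
            string name = rdr.GetString(0);
            byte[] buffer = new byte[8192];
            long offset = 0, read;
            while ((read = rdr.GetBytes(1, offset, buffer, 0, buffer.Length)) > 0)
            {
                offset += read;
                // ... write buffer[0..read) to a file or output stream ...
            }
        }
    }
}
```

The trade-off is that you give up random access to columns; once you move past a column you cannot read it again.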
Asp.Net Web applications Performance tuning
1) Disable tracing unless required: <trace enabled="false" />
2) Turn off Session State, if not required
<%@ Page EnableSessionState="false" %>
3) Disable View State of a Page if possible
4) Set debug=false in web.config
5) Avoid Response.Redirect
5a) To reduce the CLR exception count, use Response.Redirect(".aspx", false) instead of Response.Redirect(".aspx").
6) Use StringBuilder to concatenate strings
7) Avoid throwing exceptions
8) Use Finally Method to kill resources
9) Use Client Side Scripts for validations
10) Avoid unnecessary round trips to the server
11) Use Page.ISPostBack
12) Include Return Statements with in the Function/Method
13) Use Foreach loop instead of For loop for String Iteration
14) Avoid Unnecessary Indirection
15) Use "ArrayLists" in place of arrays
16) Always check Page.IsValid when using Validator Controls
17) Use Paging
18) Store your content by using caching
19) Use low cost authentication
20) Minimize the number of web server controls
21) Avoid using unmanaged code
22) Avoid making frequent calls across processes
23) Cleaning Up Style Sheets and Script Files
24) Design with ValueTypes
25) Minimize assemblies
26) Encode Using ASCII When You Don't Need UTF
27) Avoid Recursive Functions / Nested Loops
28) Minimize the Use of Format ()
29) Place StyleSheets into the Header
30) Put Scripts to the end of Document
31) Make JavaScript and CSS External
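The StringBuilder advice above deserves a concrete illustration: repeated use of + on strings allocates a new string on every operation, while StringBuilder appends into a reusable internal buffer. A minimal sketch:

```
// Requires: using System.Text;
// Concatenating in a loop with + would create a new string per iteration;
// StringBuilder grows one buffer instead.
StringBuilder sb = new StringBuilder();
for (int i = 0; i < 1000; i++)
{
    sb.Append("row ");
    sb.Append(i);
    sb.Append("<br />");
}
string html = sb.ToString();
```

The difference is negligible for a handful of concatenations but significant inside loops that build page output.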
Tuesday, April 22, 2008
Storing images in the database by converting them to bytes, then reading them back:
cmdSelect.Parameters.Add("@ID",SqlDbType.Int,4);
cmdSelect.Parameters["@ID"].Value=this.editID.Text;
this.sqlConnection1.Open();
byte[] barrImg=(byte[])cmdSelect.ExecuteScalar();
string strfn=Convert.ToString(DateTime.Now.ToFileTime());
FileStream fs=new FileStream(strfn,FileMode.CreateNew,FileAccess.Write);
fs.Write(barrImg,0,barrImg.Length);
fs.Flush();
fs.Close();
pictureBox1.Image=Image.FromFile(strfn);
---------------------
http://www.odetocode.com/Articles/172.aspx
---------------------
Dim myConnection As SqlConnection = New SqlConnection(ConfigurationSettings.AppSettings("ConnectionString"))
Dim myCommand As SqlCommand = New SqlCommand("sp_das_person_real_images_sel", myConnection)
myCommand.CommandType = CommandType.StoredProcedure
Try
myConnection.Open()
DG_Persons.DataSource = myCommand.ExecuteReader(CommandBehavior.CloseConnection)
DG_Persons.DataBind()
Catch SQLexc As SqlException
Response.Write("Error occurred while generating data. Error is " & SQLexc.ToString())
End Try
javascript
-----------------------
function hide()
{
document.form1.check.style.visibility = 'hidden';
return;
}
----------------------
Call the above function in button Onclick="hide();"
Chat Program in ASP.NET
How to open a file-download dialog in web applications
DisplayDownloadDialog("MyDown.doc")
----------------------------------------------------------
Sub DisplayDownloadDialog(ByVal PathVirtual As String)
Dim strPhysicalPath As String
Dim objFileInfo As System.IO.FileInfo
Try
strPhysicalPath = Server.MapPath(PathVirtual)
'exit if file does not exist
If Not System.IO.File.Exists(strPhysicalPath) Then Exit Sub
objFileInfo = New System.IO.FileInfo(strPhysicalPath)
Response.Clear()
'Add Headers to enable dialog display
Response.AddHeader("Content-Disposition", "attachment; filename=" & objFileInfo.Name)
Response.AddHeader("Content-Length", objFileInfo.Length.ToString())
Response.ContentType = "application/octet-stream"
Response.WriteFile(objFileInfo.FullName)
Catch
'on exception take no action
'you can implement differently
Finally
Response.End()
End Try
End Sub
Cursors
Static cursors
Dynamic cursors
Forward-only cursors
Keyset-driven cursors
Static cursors:
A static cursor always displays the result set as it was when the cursor was opened. Static cursors are always read-only.
Dynamic Cursors:
Dynamic cursors are the opposite of static cursors. Dynamic cursors reflect all changes made to the rows in their result set when scrolling through the cursor.
Forward-only Cursors:
A forward-only cursor does not support scrolling; it supports only fetching the rows serially from the start to the end of the cursor.
Keyset-driven Cursors:
The keyset is the set of key values from all the rows that qualified for the SELECT statement at the time the cursor was opened. The membership and order of rows are fixed when the cursor is opened, but changes made to the data values in the underlying rows are visible as the rows are fetched.
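As a sketch of the syntax, a static, forward-only T-SQL cursor over a hypothetical employee table could look like this (the table and column names are illustrative):

```
-- Declare a static (snapshot, read-only), forward-only cursor.
DECLARE emp_cursor CURSOR FORWARD_ONLY STATIC FOR
    SELECT fname, lname FROM employee;

DECLARE @fname VARCHAR(30), @lname VARCHAR(30);

OPEN emp_cursor;
FETCH NEXT FROM emp_cursor INTO @fname, @lname;
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @fname + ' ' + @lname;
    FETCH NEXT FROM emp_cursor INTO @fname, @lname;
END;

CLOSE emp_cursor;
DEALLOCATE emp_cursor;
```

Swapping STATIC for DYNAMIC or KEYSET changes which concurrent modifications the fetches can see, as described above.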
Wednesday, April 16, 2008
SQL KEYS
How to get the day of the month for the current date
SELECT DAY(GETDATE())
How to get the version of SQL Server
SELECT @@VERSION AS 'SQL Server Version'
---------------------------------------------------------------------------
Be aware that there are often subtle syntax variations between different database systems. Also other key properties (for example 'clustered') will vary between database systems. Therefore please treat this part of the SQL crib sheet as a guide only.
Create a primary key on a table:
Alter Table TheTable Add Primary Key (field1, field2)
To add an index on a field:
alter table TableName Add Index (field1)
To remove a primary key:
alter table TheTable drop primary key
Simple SQL Statements
Second highest salary:
SELECT TOP 1 sal FROM (SELECT TOP 2 sal FROM jobs ORDER BY sal DESC)
AS E ORDER BY sal ASC
Nth highest salary (replace n with the rank you want):
select * from emp e where n = (select count(distinct sal) from emp where e.sal <= sal)
Rows whose min_lvl value occurs more than once:
Select * from jobs where min_lvl in (select min_lvl from jobs group by min_lvl having count(*) > 1)
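The Nth-highest idea can also be written with nested TOP queries, like the second-highest example above. A sketch for the 3rd highest salary against the same jobs table:

```
-- 3rd highest salary: take the top 3 distinct salaries in descending
-- order, then pick the smallest of those three.
SELECT TOP 1 sal
FROM (SELECT DISTINCT TOP 3 sal FROM jobs ORDER BY sal DESC) AS E
ORDER BY sal ASC;
```

DISTINCT matters here: without it, tied salaries would count toward the TOP 3 and shift the result.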
General Concepts
A constraint is a property assigned to a column or the set of columns in a table that prevents certain types of inconsistent data values from being placed in the column(s). Constraints are used to enforce the data integrity. This ensures the accuracy and reliability of the data in the database. The following categories of the data integrity exist:
Entity Integrity
Domain Integrity
Referential integrity
User-Defined Integrity
Entity Integrity ensures that there are no duplicate rows in a table. Domain Integrity enforces valid entries for a given column by restricting the type, the format, or the range of possible values. Referential Integrity ensures that rows which are used by other records cannot be deleted (so that corresponding data values between tables remain consistent). User-Defined Integrity enforces specific business rules that do not fall into the entity, domain, or referential integrity categories. Each of these categories of data integrity can be enforced by the appropriate constraints. Microsoft SQL Server supports the following constraints:
PRIMARY KEY
UNIQUE
FOREIGN KEY
CHECK
NOT NULL
A PRIMARY KEY constraint is a unique identifier for a row within a database table. Every table should have a primary key constraint to uniquely identify each row, and only one primary key constraint can be created per table. Primary key constraints are used to enforce entity integrity.
A UNIQUE constraint enforces the uniqueness of the values in a set of columns, so no duplicate values are entered. Unique key constraints are used to enforce entity integrity, as the primary key constraints are.
A FOREIGN KEY constraint prevents any actions that would destroy the links between tables with corresponding data values. A foreign key in one table points to a primary key in another table. Foreign keys prevent actions that would leave rows with foreign key values when there are no primary keys with that value. Foreign key constraints are used to enforce referential integrity.
A CHECK constraint is used to limit the values that can be placed in a column. Check constraints are used to enforce domain integrity.
A NOT NULL constraint enforces that the column will not accept null values. Not null constraints are used to enforce domain integrity, as check constraints are.
You can create constraints when the table is created, as part of the table definition, by using the CREATE TABLE statement.
Examples
The following example creates a check_sale CHECK constraint on an employee table:
CREATE TABLE employee( EmployeeId INT NOT NULL, LName VARCHAR(30) NOT NULL, FName VARCHAR(30) NOT NULL, Address VARCHAR(100) NOT NULL, HireDate DATETIME NOT NULL, Salary MONEY NOT NULL CONSTRAINT check_sale CHECK (salary > 0))
You can add constraints to an existing table by using the ALTER TABLE statement. The following example adds a pk_employee primary key constraint on an employee table:
ALTER TABLE employee ADD CONSTRAINT pk_employee PRIMARY KEY (EmployeeId)
You can add the primary or unique key constraint into an existing table only when there are no duplicate rows in the table. You can drop constraints in an existing table by using the ALTER TABLE statement. The following example drops the pk_employee primary key constraint in the employee table:
ALTER TABLE employee DROP CONSTRAINT pk_employee
Sometimes you need to perform actions that require a FOREIGN KEY or CHECK constraint to be disabled. For example, suppose your company does not hire foreign employees and you created the appropriate constraint, but the situation changed and your boss needs to hire one foreign employee, just this once. In this case, you can disable the constraint by using the ALTER TABLE statement. After those actions are performed, you can re-enable the FOREIGN KEY and CHECK constraints, again by using the ALTER TABLE statement. The following example disables the check_sale constraint in the employee table and enables it again later:
-- disable the check_sale constraint in the employee table
ALTER TABLE employee NOCHECK CONSTRAINT check_sale
-- enable the check_sale constraint in the employee table
ALTER TABLE employee CHECK CONSTRAINT check_sale
Delete & Truncate
Delete command:
1. It's a DML command.
2. The deletion can be rolled back.
3. It's slower than Truncate because it logs each row deletion.
4. Delete can fire triggers.
Truncate command:
1. It's a DDL command.
2. Once committed, the data cannot be rolled back.
3. It's faster than Delete because it deallocates data pages rather than logging individual row deletions.
4. Truncate does not fire DELETE triggers.
In both cases only the table data is removed, not the table structure.
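A quick sketch of the difference in use, borrowing the atmt_products table from the stored procedure examples below:

```
-- DELETE removes rows one at a time and can target a subset
-- of rows with a WHERE clause.
DELETE FROM atmt_products WHERE active = 0;

-- TRUNCATE removes all rows at once, resets any IDENTITY seed,
-- and cannot take a WHERE clause.
TRUNCATE TABLE atmt_products;
```

Reach for TRUNCATE when you genuinely want to empty the whole table; otherwise DELETE with a WHERE clause is the right tool.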
Reflection
While there are many reasons to use reflection, let me give you a real-life example. Say you are building a custom component in .NET that requires an XML file to be used along with your component's DLL. You can store the XML file in the /bin directory with your DLL, but how will you access the XML file without knowing the correct path to the /bin directory? I'm sure this is a rhetorical question if you have understood even 10% of what I have written so far, but I'll tell you anyway: reflection. I've created a sample code snippet for you that does just this. Keep reading…
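A minimal sketch of that idea: ask the executing assembly for its own location, then build the path to a file sitting next to the DLL. The XML file name here is a hypothetical example:

```
// Requires: using System.IO; using System.Reflection;
// Find the folder this assembly (e.g. the /bin directory) was loaded from,
// then locate a settings file that ships alongside the DLL.
string asmPath = Assembly.GetExecutingAssembly().Location;
string binDir  = Path.GetDirectoryName(asmPath);
string xmlPath = Path.Combine(binDir, "ComponentSettings.xml"); // hypothetical file name

if (File.Exists(xmlPath))
{
    string xml = File.ReadAllText(xmlPath);
    // ... load the component configuration from xml ...
}
```

No hard-coded path is needed; the component finds its companion file wherever the host application installs it.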
Simple Mailing
In .NET 2.0, use System.Net.Mail (the sample below uses the .NET 1.1 System.Web.Mail API).
-----------------------------------------------
2.0
string AdminEmailId = System.Configuration.ConfigurationSettings.AppSettings["Conn"];
------------------------
string AdminEmailId = ConfigurationSettings.AppSettings["AdminEmailId"];
MailMessage strmail = new MailMessage();
strmail.To = AdminEmailId;
strmail.From = txtEmail.Text;
strmail.Body = "Hi,<br>Below are the details of a person who is looking for a job.<br>" + BodyString + "<br>Thanks.";
strmail.BodyFormat = MailFormat.Html;
strmail.Subject = "Hai";
System.Web.Mail.SmtpMail.Send(strmail);
Get Post
1. In case of the Post method, the client input data is carried in the request body along with the form headers, not in the URL.
2. The limitation of the Get method is that the URL length is capped by browsers and servers (on the order of a couple of thousand characters), whereas the Post method has no practical limit on the size of the data.
3. The advantage of the Get method is that it can be slightly faster than Post. Post is safer than Get because the URL doesn't expose the data.
4. In classic ASP the default method is Get, whereas in ASP.NET it is Post.
How to prevent my .NET DLL to be decompiled?
The process by which you can stop this reverse engineering is called "obfuscation". It's a technique that foils decompilers. Many third parties (XenoCode, Demeanor for .NET) provide .NET obfuscation solutions. Microsoft includes one, Dotfuscator Community Edition, with Visual Studio .NET.
Compiled vs. Interpreted
With a compiled language, code you enter is reduced to a set of machine-specific instructions before being saved as an executable file.
With interpreted languages, the code is saved in the same format that you entered.
Compiled programs generally run faster than interpreted ones because interpreted programs must be reduced to machine instructions at runtime.
However, with an interpreted language you can do things that cannot be done in a compiled language.
For example, interpreted programs can modify themselves by adding or changing functions at runtime. It is also usually easier to develop applications in an interpreted environment because you don't have to recompile your application each time you want to test a small section.
VCS
CVS (Concurrent Versions System) is a version control system. It is important to note that CVS is not a build system, but rather a way to control disparate versions of code as it is developed over time.
Configure SMTP Mail Server
1) Go to IIS.
2) Right-click on the Default SMTP Virtual Server.
3) Go to Properties → the "Access" tab.
4) In the "Relay restrictions" section, click the Relay button.
5) Select the "Only the list below" radio button → click the Add button.
6) Select Single Computer à Give IP Address as 127.0.0.1
7) Click on OK.
Note: Please START the Default SMTP Virtual Server in IIS if it is STOPPED.
Simple Stored Procedure
========================
CREATE PROCEDURE SP_INSERT_VALUES
(
@ProdName char(50), @active bit, @SortOrder int
)
as
INSERT INTO atmt_products (ProdName,active,SortOrder )
VALUES
(@ProdName,@active,@SortOrder)
return @@identity
GO
=================================
UPDATE:-
===========================
CREATE PROCEDURE SP_UPDATE_VALUES
(
@prodid int, @ProdName Nchar(50), @active bit, @SortOrder int)
as
UPDATE atmt_products SET ProdName=@ProdName,active=@active,SortOrder=@SortOrder
WHERE
prodid=@prodid
GO
==========================
DELETE:-
========================
CREATE PROCEDURE SP_DELETE
(
@prodid int
)
as
delete from atmt_products WHERE prodid=@prodid
GO
TRIGGERS
Unlike stored procedures, an SQL trigger cannot be called directly from an application. Instead, an SQL trigger is invoked by the database management system on the execution of a triggering insert, update, or delete operation. The definition of the SQL trigger is stored in the database management system, and the trigger is invoked by the database management system when the SQL table that the trigger is defined on is modified.
Implementing Triggers in SQL Server 2000
Triggers are special types of Stored Procedures that are defined to execute automatically in place of or after data modifications. They can be executed automatically on the INSERT, DELETE and UPDATE triggering actions.
There are two different types of triggers in Microsoft SQL Server 2000. They are INSTEAD OF triggers and AFTER triggers. These triggers differ from each other in terms of their purpose and when they are fired. In this article we shall discuss each type of trigger.
First of all, let's create a sample database with some tables and insert some sample data in those tables using the script below:
Create Database KDMNN
GO
USE KDMNN
GO
CREATE TABLE [dbo].[User_Details] (
[UserID] [int] NULL ,
[FName] [varchar] (50) NOT NULL ,
[MName] [varchar] (50) NULL ,
[LName] [varchar] (50) NOT NULL ,
[Email] [varchar] (50) NOT NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[User_Master] (
[UserID] [int] IDENTITY (1, 1) NOT NULL ,
[UserName] [varchar] (50) NULL ,
[Password] [varchar] (50) NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[User_Master] WITH NOCHECK ADD
CONSTRAINT [PK_User_Master] PRIMARY KEY CLUSTERED
(
[UserID]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[User_Details] ADD
CONSTRAINT [FK_User_Details_User_Master] FOREIGN KEY
(
[UserID]
) REFERENCES [dbo].[User_Master] (
[UserID]
)
GO
INSERT INTO USER_MASTER(USERNAME, PASSWORD)
SELECT 'Navneeth','Navneeth' UNION
SELECT 'Amol','Amol' UNION
SELECT 'Anil','Anil' UNION
SELECT 'Murthy','Murthy'
INSERT INTO USER_DETAILS(USERID, FNAME, LNAME, EMAIL)
SELECT 1,'Navneeth','Naik','navneeth@kdmnn.com' UNION
SELECT 2,'Amol','Kulkarni','amol@kdmnn.com' UNION
SELECT 3,'Anil','Bahirat','anil@kdmnn.com' UNION
SELECT 4,'Murthy','Belluri','murthy@kdmnn.com'
AFTER Triggers
The type of trigger that gets executed automatically after the statement that triggered it completes is called an AFTER trigger. Note that although it fires after the triggering statement, it executes before the transaction is committed or rolled back.
Using the below script, first we shall create a trigger on the table USER_MASTER for the INSERT event of the table.
USE KDMNN
Go
CREATE TRIGGER trgInsert
ON User_Master
FOR INSERT
AS
Print ('AFTER Trigger [trgInsert] – Trigger executed !!')
GO
BEGIN TRANSACTION
DECLARE @ERR INT
INSERT INTO USER_MASTER(USERNAME, PASSWORD)
VALUES('Damerla','Damerla')
SET @ERR = @@Error
IF @ERR = 0
BEGIN
ROLLBACK TRANSACTION
PRINT 'ROLLBACK TRANSACTION'
END
ELSE
BEGIN
COMMIT TRANSACTION
PRINT 'COMMIT TRANSACTION'
END
Output
AFTER Trigger [trgInsert] – Trigger executed !!
(1 row(s) affected)
ROLLBACK TRANSACTION
By looking at the output, we can conclude that before the transaction is rolled back or committed, the AFTER trigger gets executed automatically. A table can have several AFTER triggers for each of the three triggering actions i.e., INSERT, DELETE and UPDATE. Using the below script, we shall create two triggers on the table User_Master for the INSERT triggering action.
CREATE TRIGGER trgInsert2
ON User_Master
FOR INSERT
AS
BEGIN
Print ('AFTER Trigger [trgInsert2] – Trigger executed !!')
END
GO
CREATE TRIGGER trgInsert3
ON User_Master
FOR INSERT
AS
BEGIN
Print ('AFTER Trigger [trgInsert3] – Trigger executed !!')
END
GO
BEGIN TRANSACTION
DECLARE @ERR INT
INSERT INTO USER_MASTER(USERNAME, PASSWORD)
VALUES('Damerla','Damerla')
SET @ERR = @@Error
IF @ERR = 0
BEGIN
ROLLBACK TRANSACTION
PRINT 'ROLLBACK TRANSACTION'
END
ELSE
BEGIN
COMMIT TRANSACTION
PRINT 'COMMIT TRANSACTION'
END
Output
AFTER Trigger [trgInsert] – Trigger executed !!
AFTER Trigger [trgInsert2] – Trigger executed !!
AFTER Trigger [trgInsert3] – Trigger executed !!
(1 row(s) affected)
ROLLBACK TRANSACTION
Triggers can be used in the following scenarios, such as if the database is de-normalized and requires an automated way to update redundant data contained in multiple tables, or if customized messages and complex error handling are required, or if a value in one table must be validated against a non-identical value in another table.
Triggers are a powerful tool that can be used to enforce business rules automatically when data is modified. Triggers can also be used to maintain data integrity, but they are not a substitute for constraints: use triggers to maintain data integrity only if you are unable to enforce it using CONSTRAINTS, RULES, and DEFAULTS. Triggers cannot be created on temporary tables.
NORMALIZATION
Normalization is the process of designing a data model to efficiently store data in a database. The end result is that redundant data is eliminated, and only data related to the attribute is stored within the table.
For example, let's say we store City, State and ZipCode data for Customers in the same table as Other Customer data. With this approach, we keep repeating the City, State and ZipCode data for all Customers in the same area. Instead of storing the same data again and again, we could normalize the data and create a related table called City. The "City" table could then store City, State and ZipCode along with IDs that relate back to the Customer table, and we can eliminate those three columns from the Customer table and add the new ID column.
Normalization rules have been broken down into several forms. People often refer to the third normal form (3NF) when talking about database design. This is what most database designers try to achieve: In the conceptual stages, data is segmented and normalized as much as possible, but for practical purposes those segments are changed during the evolution of the data model. Various normal forms may be introduced for different parts of the data model to handle the unique situations you may face.
Whether you have heard about normalization or not, your database most likely follows some of the rules, unless all of your data is stored in one giant table. We will take a look at the first three normal forms and the rules for determining the different forms here.
Rules for First Normal Form (1NF)
Eliminate repeating groups. For example, a Computer table that stores several applications in a repeating Software column contains repeating groups; each piece of software should become its own row in a related table.
Rules for second Normal Form (2NF)
Eliminate redundant data, plus meet 1NF. Repeating the name of the software on every row is redundant data; store it once and reference it by an ID.
Rules for Third Normal Form (3NF)
Eliminate columns not dependent on the key, plus meet 1NF and 2NF. A table that contains data about both the computer and the user mixes two subjects in one place.
To eliminate columns not dependent on the key, we would split it into a Computer table and a User table. Now the data stored in the Computer table is only related to the computer, and the data stored in the User table is only related to the user.
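A sketch of that 3NF split in T-SQL, with hypothetical table and column names:

```
-- Before: one table mixing computer data and user data.
-- After: two tables, each dependent only on its own key.
CREATE TABLE Users (
    UserId   INT PRIMARY KEY,
    FullName VARCHAR(50) NOT NULL
);

CREATE TABLE Computers (
    ComputerId INT PRIMARY KEY,
    Model      VARCHAR(50) NOT NULL,
    UserId     INT REFERENCES Users(UserId)  -- who the machine is assigned to
);
```

The foreign key replaces the user columns that previously lived in the computer table, so user details are stored exactly once.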
What does normalization have to do with SQL Server?
To be honest, the answer here is nothing. SQL Server, like any other RDBMS, couldn't care less whether your data model follows any of the normal forms. You could create one table and store all of your data in one table or you can create a lot of little, unrelated tables to store your data. SQL Server will support whatever you decide to do. The only limiting factor you might face is the maximum number of columns SQL Server supports for a table.
SQL Server does not force or enforce any rules that require you to create a database in any of the normal forms. You are able to mix and match any of the rules you need, but it is a good idea to try to normalize your database as much as possible when you are designing it. People tend to spend a lot of time up front creating a normalized data model, but as soon as new columns or tables need to be added, they forget about the initial effort that was devoted to creating a nice clean model.
To assist in the design of your data model, you can use the DaVinci tools that are part of SQL Server Enterprise Manager.
Advantages of normalization
1. Smaller database: By eliminating duplicate data, you will be able to reduce the overall size of the database.
2. Better performance:
a. Narrow tables: Having more fine-tuned tables means fewer columns per table, which lets you fit more records per data page.
b. Fewer indexes per table mean faster maintenance tasks such as index rebuilds.
c. You join only the tables that you need.
Disadvantages of normalization
1. More tables to join: By spreading out your data into more tables, you increase the need to join tables.
2. Tables contain codes instead of real data: Repeated data is stored as codes rather than meaningful data, so there is always a need to go to the lookup table for the value.
3. Data model is difficult to query against: The data model is optimized for applications, not for ad hoc querying.
Summary
Your data model design is both an art and a science. Balance what works best to support the application that will use the database against storing data in an efficient and structured manner. For transaction-based systems, a highly normalized database design is the way to go; it ensures consistent data throughout the entire database and good performance. For reporting-based systems, a less normalized design is usually the best approach. You will eliminate the need to join a lot of tables, and queries will be faster. Plus, the database will be much more user friendly for ad hoc reporting needs.