
Wednesday, March 7, 2012

carriage return problem..

While retrieving user input from an input control, e.g. a multi-line textbox, and inserting it into the database, the carriage return (the 'Enter' key) is not getting inserted into the database; instead it inserts a square box in the database, and the text typed after the 'Enter' key is also not getting inserted. Please help.|||The carriage return is getting inserted; the square box confirms that. When rendering it to a web page, you need to replace the carriage returns with their HTML equivalent, "<br />".|||I'm having the same problem. How would this be accomplished in VB?|||

VB.Net
<%# Eval("MyValue").ToString().Replace(vbcrlf,"<br />") %>

C#
<%# Eval("MyValue").ToString().Replace("\r","<br />") %>

|||

Thanks!

It works great!

I was also able to get it going with the following a few minutes ago:

<%# Eval("MyValue").Replace(Environment.NewLine, "<br />") %>

Which method do you recommend or are they both good?

|||

With the .NET framework, there are approximately 63 ways to skin most particular cats. The difference between them is most often negligible, and you should use whatever you prefer so long as your page load doesn't appear to be adversely affected. Occasionally you will get a guru tell you to use one option rather than another, because it shaves nanoseconds off the operation, and they will have benchmark tests to prove it. Personally, I think life is too short. I usually use the option that requires less typing, unless I am informed of a convincing reason to use another.

Environment.NewLine has the benefit that it can be used regardless of page language, so I shall use it in future when I answer this question without knowing the language the poster is using. Quite simply, it meets my desire to do less typing.
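
A note for anyone reusing the snippets above: user-entered text should normally be HTML-encoded before the newline replacement, otherwise any markup the user typed is rendered verbatim (an XSS risk). Below is a minimal C# code-behind sketch of that order of operations; the TextHelper class and RenderMultiline name are hypothetical, not from this thread.

using System;
using System.Web;

public static class TextHelper
{
    // Hypothetical helper: encode first, then turn line breaks into <br />.
    public static string RenderMultiline(string value)
    {
        if (String.IsNullOrEmpty(value))
            return String.Empty;

        // HtmlEncode neutralises any markup the user typed...
        string encoded = HttpUtility.HtmlEncode(value);

        // ...then both Windows (\r\n) and bare (\n) line breaks become <br />.
        return encoded.Replace("\r\n", "<br />").Replace("\n", "<br />");
    }
}

In a binding expression this would be used as, for example:
<%# TextHelper.RenderMultiline(Eval("MyValue").ToString()) %>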

Thursday, February 16, 2012

Capacity planning question

I am working on an RFI for a SQL Server 2000 database application, and I am
looking for some general answers to the questions below:
Capacity and resource planning for SQL Server 2000
1) resource requirements for 500, 1000, 2000 concurrent users (database,
memory, CPU, etc.)
2) deployment requirements for 500, 1000, 2000 concurrent users (server
configuration, architecture model, etc.)|||You will probably get very little, because it depends on the transactions: are
they reads or writes? Do they use transaction control or not? How long
are the transactions? And so on.
Other than that:
SQL loves memory.
More processors are better (generally, even if they are slower) than fewer,
faster processors.
Multi-core processors are good.
More on-board cache is good.
Keep your transaction logs mirrored on different drives than your data.
Configure disk not only for space but for throughput - you might need more
disk heads to carry the volume, even if you have enough space with fewer
drives.
Just some general guidelines.
--
Wayne Snyder MCDBA, SQL Server MVP
Mariner, Charlotte, NC
I support the Professional Association for SQL Server (PASS) and its
community of SQL Professionals.
"George Kwong" wrote:

> I am working on an RFI for a SQL Server 2000 database application, and I am
> looking for some general answers to the questions below:
> Capacity and resource planning for SQL Server 2000
> 1) resource requirements for 500, 1000, 2000 concurrent users (database,
> memory, CPU, etc.)
> 2) deployment requirements for 500, 1000, 2000 concurrent users (server
> configuration, architecture model, etc.)
>
>|||I will add that, for an installation with those projected sizes and issues,
if you do not bring in someone with adequate experience to assist in the
design, planning, and deployment, you will be making a major mistake.
Arnie Rowland, YACE*
"To be successful, your heart must accompany your knowledge."
*Yet Another certification Exam
"George Kwong" <geokwo@.Lexingtontech.com> wrote in message
news:O15VKVplGHA.3816@.TK2MSFTNGP02.phx.gbl...
>I am working on an RFI for a SQL Server 2000 database application, and I am
> looking for some general answers to the questions below:
> Capacity and resource planning for SQL Server 2000
> 1) resource requirements for 500, 1000, 2000 concurrent users (database,
> memory, CPU, etc.)
> 2) deployment requirements for 500, 1000, 2000 concurrent users (server
> configuration, architecture model, etc.)
>
>|||We developed the application in VB, and we are trying to bid on a customer's
job. Is there a way to run some tests to find out the resource usage?
No, we use very minimal transaction control; transactions are relatively
small, and we do both reads and writes.
Thanks.
"Wayne Snyder" <wayne.nospam.snyder@.mariner-usa.com> wrote in message
news:10F4EC48-C9AD-45E3-938B-460A95708CB2@.microsoft.com...
> You will probably get very little, because it depends on the transactions:
> are they reads or writes? Do they use transaction control or not? How long
> are the transactions? And so on.
> Other than that:
> SQL loves memory.
> More processors are better (generally, even if they are slower) than fewer,
> faster processors.
> Multi-core processors are good.
> More on-board cache is good.
> Keep your transaction logs mirrored on different drives than your data.
> Configure disk not only for space but for throughput - you might need more
> disk heads to carry the volume, even if you have enough space with fewer
> drives.
> Just some general guidelines.
> --
> Wayne Snyder MCDBA, SQL Server MVP
> Mariner, Charlotte, NC
> I support the Professional Association for SQL Server (PASS) and its
> community of SQL Professionals.
>
> "George Kwong" wrote:
>|||Hi George
Are you able to benchmark other customers' installations of your application
& project the performance characteristics from those installations against
the one you're bidding on?
I'd be tracking various perfmon counters & SQL diagnostics for this,
including at least:
Perfmon:
SQLServer:Buffer Manager counter object, especially Page Life Expectancy, to
determine memory characteristics
CPU utilisation - collect the system-wide counter & also the sqlservr process's
CPU utilisation counter
Physical & logical disk counters - especially disk bytes read/written per sec
& disk queues
There are other useful counters, but these are fundamental to pulling
together an informative picture on how your existing installations are
operating under specific hardware specs.
I'd also be taking a close look at how SQL Server is using memory
internally, using dbcc memorystatus to ensure you understand how your
system's using memory.
Performing some SQL Traces might also help you to ensure your application is
well tuned, which is important when drawing benchmark conclusions.
HTH
Regards,
Greg Linwood
SQL Server MVP
"George Kwong" <geokwo@.Lexingtontech.com> wrote in message
news:uagkjdtlGHA.4512@.TK2MSFTNGP04.phx.gbl...
> We developed the application in VB, and we are trying to bid on a customer's
> job. Is there a way to run some tests to find out the resource usage?
> No, we use very minimal transaction control; transactions are relatively
> small, and we do both reads and writes.
> Thanks.
>
> "Wayne Snyder" <wayne.nospam.snyder@.mariner-usa.com> wrote in message
> news:10F4EC48-C9AD-45E3-938B-460A95708CB2@.microsoft.com...
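
To make Greg's counter list concrete, the following is a minimal C# sketch (an illustration, not part of the original thread) that samples those counters with System.Diagnostics.PerformanceCounter. It assumes a default local SQL Server instance; named instances expose the object as MSSQL$<name>:Buffer Manager instead.

using System;
using System.Diagnostics;
using System.Threading;

class CounterSample
{
    static void Main()
    {
        // The counters discussed above: buffer page life expectancy,
        // system-wide & sqlservr CPU, and physical disk throughput/queues.
        var counters = new[]
        {
            new PerformanceCounter("SQLServer:Buffer Manager", "Page life expectancy"),
            new PerformanceCounter("Processor", "% Processor Time", "_Total"),
            new PerformanceCounter("Process", "% Processor Time", "sqlservr"),
            new PerformanceCounter("PhysicalDisk", "Disk Bytes/sec", "_Total"),
            new PerformanceCounter("PhysicalDisk", "Current Disk Queue Length", "_Total")
        };

        // Rate counters need two samples; the first NextValue() returns 0.
        foreach (var c in counters) c.NextValue();
        Thread.Sleep(1000);

        foreach (var c in counters)
            Console.WriteLine("{0}\\{1} = {2:F1}", c.CategoryName, c.CounterName, c.NextValue());
    }
}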
>|||"George Kwong" <geokwo@.Lexingtontech.com> wrote in message
news:uagkjdtlGHA.4512@.TK2MSFTNGP04.phx.gbl...
> We developed the application in VB, and we are trying to bid on a customer's
> job. Is there a way to run some tests to find out the resource usage?
>
Yes. MS Press had a book on this for SQL 2000 and I assume there is one for
SQL 2005.

> No, we use very minimal transaction control; transactions are relatively
> small, and we do both reads and writes.
>
Well, first pass, figure out "how many bytes will be read and written" for each
transaction.
How many transactions/sec do you need to cover?
Things like indices may greatly impact that. As will caching.
But as a first pass, it can give you a sense of things like disk I/O, which is
generally the slowest part of a system.
If you're reading/writing, say, 100 bytes/transaction and doing 100/sec, well,
you need 10,000 bytes/sec of throughput on your disks.
This ain't much.
If you're doing 1,000 bytes/transaction at 1,000/sec - a million bytes/sec -
well, that's another kettle of fish.

> thanks.
>
> "Wayne Snyder" <wayne.nospam.snyder@.mariner-usa.com> wrote in message
> news:10F4EC48-C9AD-45E3-938B-460A95708CB2@.microsoft.com...
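
For what it's worth, here is that back-of-envelope arithmetic as a tiny C# sketch (illustrative figures from the post above, not a benchmark):

using System;

class ThroughputEstimate
{
    static void Main()
    {
        // Required disk throughput = bytes per transaction * transactions per second.
        Estimate(100, 100);     // the "ain't much" case: 10,000 bytes/sec
        Estimate(1000, 1000);   // the "kettle of fish" case: 1,000,000 bytes/sec
    }

    static void Estimate(long bytesPerTxn, long txnPerSec)
    {
        Console.WriteLine("{0} bytes/txn x {1} txn/sec = {2:N0} bytes/sec of disk I/O",
            bytesPerTxn, txnPerSec, bytesPerTxn * txnPerSec);
    }
}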