Is easy to update – either in real time or in monthly batches.
Permits ad hoc drill-down queries and reports.
Allows you to store an unlimited amount of data about any object.
Permits you to add fields without redoing the whole database.
Makes it easy to modify the data and to retrieve information.
Makes it easy to develop and carry out personalized communications.
Permits the inclusion of business rules in the database design.
http://www.crm2day.com/editorial/EpVkEkkylZNkorRHLx.php
Friday, December 31, 2010
DB Design consideration
at 4:16 PM 0 comments Posted by roni schuetz
Labels: Database
Thursday, December 23, 2010
How To: Sending Email over PowerShell
$EmailFrom = "your@email.com"
$EmailTo = "destination@email.com"
$Subject = "Your subject goes here"
$Body = "the body info which usually will a parameter with gathered info."
$SMTPServer = "smtp.email.com"
$SMTPClient = New-Object Net.Mail.SmtpClient($SMTPServer, 25)
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("user", "pwd");
$SMTPClient.Send($EmailFrom, $EmailTo, $Subject, $Body)
at 3:07 PM 0 comments Posted by roni schuetz
Labels: Power Shell
links for virtualization architectures
A great overview of tools and links for virtualization architectures
http://www.chriswolf.com/?page_id=93
at 12:27 PM 0 comments Posted by roni schuetz
Labels: virtualization
Wednesday, December 22, 2010
How To: Add Context Menu Registry
I found today a small utility to scan and visualize my hard disk (http://www.steffengerlach.de/freeware/). In the installation folder I found the following code to add a program to the registry:
Save the content in a *.reg file and execute it:
ADD:
Windows Registry Editor Version 5.00
[HKEY_CLASSES_ROOT\Directory\shell\Scan_Content]
@="Scan Content"
[HKEY_CLASSES_ROOT\Directory\shell\Scan_Content\command]
@="C:\\Program Files\\Scanner\\Scanner.exe \"%1\""
[HKEY_CLASSES_ROOT\Drive\shell\Scan_Content]
@="Scan Content"
[HKEY_CLASSES_ROOT\Drive\shell\Scan_Content\command]
@="C:\\Program Files\\Scanner\\Scanner.exe \"%1\""
REMOVE:
Windows Registry Editor Version 5.00
[-HKEY_CLASSES_ROOT\Directory\shell\Scan_Content]
[-HKEY_CLASSES_ROOT\Directory\shell\Scan_Content\command]
[-HKEY_CLASSES_ROOT\Drive\shell\Scan_Content]
[-HKEY_CLASSES_ROOT\Drive\shell\Scan_Content\command]
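If you prefer doing the same from code instead of a *.reg file, here is a minimal C# sketch using the Microsoft.Win32.Registry class. The command path is taken from the file above; writing to HKEY_CLASSES_ROOT requires administrator rights.
using Microsoft.Win32;
public static class ScanContentMenu
{
    // Same command as in the *.reg file above – adjust to your installation.
    private const string Command = @"C:\Program Files\Scanner\Scanner.exe ""%1""";
    public static void Add()
    {
        // Create the entries for both folders and drives.
        foreach (string root in new[] { @"Directory\shell", @"Drive\shell" })
        {
            using (RegistryKey key = Registry.ClassesRoot.CreateSubKey(root + @"\Scan_Content"))
            {
                key.SetValue(null, "Scan Content"); // default value = context menu caption
                using (RegistryKey cmd = key.CreateSubKey("command"))
                    cmd.SetValue(null, Command);
            }
        }
    }
    public static void Remove()
    {
        foreach (string root in new[] { @"Directory\shell", @"Drive\shell" })
            Registry.ClassesRoot.DeleteSubKeyTree(root + @"\Scan_Content"); // throws if the key does not exist
    }
}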
at 2:20 PM 0 comments Posted by roni schuetz
Labels: registry
Sunday, December 19, 2010
How To: Versioning Builds With TFS 2010
at 4:33 PM 0 comments Posted by roni schuetz
Labels: TFS 2010
How To: C# 4.0 Features for Office Developers
This sample shows how to use the C# 4.0 features – improved COM interop (omitting ref), indexed properties, and named and optional parameters – to create a C# application that communicates with Microsoft Office. C# developers have traditionally had to write relatively verbose code in order to access Microsoft Office applications such as Word or Excel. The new C# 4.0 features make it much simpler to call Office APIs.
Consider this declaration for a Microsoft Office method used in this sample:
void PasteSpecial(ref object IconIndex = null, ref object Link = null,
ref object Placement = null, ref object DisplayAsIcon = null,
ref object DataType = null, ref object IconFileName = null,
ref object IconLabel = null);
As you can see, this method takes a fairly large number of parameters. In C#, developers have traditionally had to fill out each parameter, even though the developers of this call had intended to simplify its use by supporting optional parameters. In C# 4.0, the new support for named and optional parameters allows the developer to specify only the parameters of interest, and to take default values for the other parameters:
word.Selection.PasteSpecial(Link: true, DisplayAsIcon: true);
In the call to the PasteSpecial method the Link and DisplayAsIcon parameters are explicitly named, and set to the value true. All the other parameters default to values specified internally by the developers of the Office API, as shown in the above signature.
You can create your own calls that support named and optional parameters. Consider this example:
public void M(int x, int y = 5, int z = 7) { }
In this method, the parameters y and z are assigned default values. Calls to this method might look like this:
M(1, 2, 3); // ordinary call of M
M(1, 2); // omitting z – equivalent to M(1, 2, 7)
M(1); // omitting both y and z – equivalent to M(1, 5, 7)
M(1, z: 3); // passing z by name
M(x: 1, z: 3); // passing both x and z by name
M(z: 3, x: 1); // reversing the order of arguments
The new dynamic feature in C# 4.0 makes Office much easier for C# developers to use. Types used in Office are now presented to C# developers as if they were declared with the type dynamic. Here is the traditional way to set a Cell property:
((Excel.Range)excel.Cells[1, 1]).Value2 = "ID";
In C# 4.0 developers can now write code that looks like this:
xl.Cells[1, 1].Value = "ID";
A feature called indexed properties allows us to simplify the call further, so that it looks like this:
xl.Cells[1, 1] = "ID";
A final feature of interest to Office developers is called No-PIA. Primary Interop Assemblies are generated from COM interfaces and provide helpful type support at design time. At runtime, however, they increase the size of your program and can cause versioning issues. The No-PIA feature allows you to continue to use PIAs at design time but omit them at runtime. The C# compiler will bake the small part of the PIA that a program actually uses directly into its assembly. You will no longer need to include PIAs in the distribution of your programs.
// Copyright © Microsoft Corporation. All Rights Reserved.
// This code released under the terms of the
// Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html.)
//
using System;
using System.Collections.Generic;
using Excel = Microsoft.Office.Interop.Excel;
using Word = Microsoft.Office.Interop.Word;
public class Account
{
public int ID { get; set; }
public double Balance { get; set; }
}
public class Program
{
static void Main(string[] args)
{
var checkAccounts = new List<Account> {
new Account {
ID = 345,
Balance = 541.27
},
new Account {
ID = 123,
Balance = -127.44
}
};
DisplayInExcel(checkAccounts, (account, cell) =>
{
// This multiline lambda will set
// custom processing rules.
cell.Value = account.ID;
cell.Offset[0, 1].Value = account.Balance;
if (account.Balance < 0)
{
cell.Interior.Color = 255;
cell.Offset[0, 1].Interior.Color = 255;
}
});
var word = new Word.Application();
word.Visible = true;
word.Documents.Add();
word.Selection.PasteSpecial(Link: true, DisplayAsIcon: true);
}
public static void DisplayInExcel(IEnumerable<Account> accounts,
Action<Account, Excel.Range> DisplayFunc)
{
var xl = new Excel.Application();
xl.Workbooks.Add();
xl.Visible = true;
xl.Cells[1, 1] = "ID";
xl.Cells[1, 2] = "Balance";
xl.Cells[2, 1].Select();
foreach (var ac in accounts)
{
DisplayFunc(ac, xl.ActiveCell);
xl.ActiveCell.Offset[1, 0].Select();
}
xl.Range["A1:B3"].Copy();
//xl.get_Range("A1:B3").Copy();
xl.Columns[1].AutoFit();
xl.Columns[2].AutoFit();
}
}
at 11:53 AM 0 comments Posted by roni schuetz
Labels: office
Thursday, December 16, 2010
ThreadSafeCache class in C#
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading;
using System.Collections.ObjectModel;
namespace Caching
{
public class ThreadSafeCache<T1, T2>
{
// one lock per cache instance – a static lock would needlessly serialize access across unrelated caches
private readonly ReaderWriterLockSlim rwlock = new ReaderWriterLockSlim();
private const int DoNotTimeOut = 0;
private Dictionary<T1, T2> mCache = new Dictionary<T1, T2>();
public bool ContainsKey(T1 key)
{
bool result = false;
rwlock.EnterReadLock();
try
{
result = mCache.ContainsKey(key);
}
finally
{
rwlock.ExitReadLock();
}
return result;
}
public T2 this[T1 index]
{
get
{
T2 result;
rwlock.EnterReadLock();
try
{
result = mCache[index];
}
finally
{
rwlock.ExitReadLock();
}
return result;
}
set
{
rwlock.EnterWriteLock();
try
{
mCache[index] = value;
}
finally
{
rwlock.ExitWriteLock();
}
}
}
public void Add(T1 key, T2 value)
{
rwlock.EnterWriteLock();
try
{
mCache.Add(key, value);
}
finally
{
rwlock.ExitWriteLock();
}
}
public void AddOrIgnore(T1 key, T2 value)
{
rwlock.EnterWriteLock();
try
{
if (!mCache.ContainsKey(key))
mCache.Add(key, value);
}
finally
{
rwlock.ExitWriteLock();
}
}
public void AddOrReplace(T1 key, T2 value)
{
rwlock.EnterWriteLock();
try
{
if (!mCache.ContainsKey(key))
mCache.Add(key, value);
else
mCache[key] = value;
}
finally
{
rwlock.ExitWriteLock();
}
}
public bool Remove(T1 key)
{
bool result = false;
rwlock.EnterWriteLock();
try
{
result = mCache.Remove(key);
}
finally
{
rwlock.ExitWriteLock();
}
return result;
}
public void Clear()
{
rwlock.EnterWriteLock();
try
{
mCache.Clear();
}
finally
{
rwlock.ExitWriteLock();
}
}
public ReadOnlyCollection<T1> Keys
{
get
{
ReadOnlyCollection<T1> result;
rwlock.EnterReadLock();
try
{
result = new ReadOnlyCollection<T1>(new List<T1>(mCache.Keys));
}
finally
{
rwlock.ExitReadLock();
}
return result;
}
}
}
}
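A quick usage sketch (key and values are just illustrative):
var cache = new Caching.ThreadSafeCache<string, int>();
cache.Add("answer", 42);
cache.AddOrReplace("answer", 43); // overwrites the existing entry
if (cache.ContainsKey("answer"))
    Console.WriteLine(cache["answer"]); // prints 43
foreach (var key in cache.Keys) // Keys returns a snapshot copy, safe to iterate
    Console.WriteLine(key);
cache.Remove("answer");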
at 4:14 PM 0 comments Posted by roni schuetz
Monday, December 13, 2010
High Availability
Availability is traditionally measured as the percentage of time that the system is available to its end users.
Therefore, 100 percent availability means that the system is available all of the time and there is no downtime. However, achieving 100 percent availability is virtually impossible. Too many technical and human factors are involved for that to be a realistic possibility. Even so, by utilizing technology and creating a suitable operating environment, very high levels of availability are achievable. The highest practical measure of availability is typically expressed as “five nines” or 99.999 percent. The percentage of availability can be calculated using the following formula:
Percentage of availability = ((total elapsed time – sum of downtime) / total elapsed time) × 100
The percentage of system availability equals the total elapsed time minus the sum of the system downtime. This result is then divided by the total elapsed time.
Let’s take a little deeper look at what this means in a practical sense. A year has a total of 8,760 hours (24 hours per day × 365 days per year = 8,760 hours). Therefore, an availability of 8,760 hours over a year would be 100 percent uptime as you can see in the following equation:
100 = ((8,760 – 0)/8,760) × 100
A much more common scenario is for a system to have a regular period of downtime every month. For many organizations this might be as little as eight hours of downtime in every month, or essentially two hours per week. This downtime might be the result of planned system maintenance such as system backup procedures. Two hours of downtime per week or eight hours per month results in 98.9 percent availability, as the following formula illustrates:
98.9 = ((8,760 – (8 × 12)) / 8,760) × 100
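The same arithmetic in a few lines of C#, in case you want to play with other downtime figures (a trivial sketch, hours as the unit):
// Percentage of availability = ((total elapsed time - sum of downtime) / total elapsed time) * 100
static double Availability(double totalHours, double downtimeHours)
{
    return (totalHours - downtimeHours) / totalHours * 100.0;
}
// Eight hours of downtime per month over one year:
// Availability(8760, 8 * 12) returns approximately 98.9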
Many organizations don’t achieve that level of availability. However, when expressed as a measure of nine, 98.9 percent isn’t even two nines, as the base level of availability is lower than 99 percent. While one nine may not seem like a high level of availability, for many organizations one day of downtime per month would be perfectly adequate.
However, many businesses are running critical lines of business and e-commerce applications where one nine of availability is not enough. These organizations require three nines or even five nines of availability. So how much actual downtime is associated with these levels of availability? Table 1-1 gives you an idea of the amount of downtime that is permitted with each increasing level of “nines.”
Number of Nines and Downtime
Number of Nines | Percentage Availability | Downtime per Year | Downtime per Month | Downtime per Week |
Two nines | 99.0% | 3.65 days | 7.30 hrs | 1.68 hrs |
Three nines | 99.9% | 8.76 hrs | 43.8 mins | 10.1 mins |
Four nines | 99.99% | 52.6 mins | 4.38 mins | 1.01 mins |
Five nines | 99.999% | 5.26 mins | 26.28 secs | 6.06 secs |
The above table shows how each increasing nine of availability requires significant decreases in downtime. While two nines of availability can be accomplished with a total of 1.68 hours of downtime per week, five nines of availability allows only slightly more than five minutes of downtime per year.
Five minutes of downtime per year is an impressive and difficult number to achieve. Another important factor to remember is that the costs and operational disciplines increase substantially with each successive level of availability. Achieving these higher levels of availability cannot be accomplished using technology only. Creating a highly available environment requires a combination of several factors.
at 3:33 PM 0 comments Posted by roni schuetz
Labels: definition
Friday, November 26, 2010
All-In-One Overview
on codeplex: http://1code.codeplex.com/
All-In-One Windows Forms Code Samples
All-In-One WPF Code Samples
All-In-One Windows Azure Code Samples
All-In-One Silverlight Code Samples
All-In-One ASP.NET Code Samples
All-In-One COM Code Samples
All-In-One Data Platform Code Samples
All-In-One Office Development Code Samples
All-In-One Interop and Fusion Code Samples
All-In-One Windows UI Code Samples
All-In-One Visual Studio Extensibility Code Samples
at 5:14 PM 0 comments Posted by roni schuetz
Labels: links
Tuesday, November 23, 2010
Developer Tools & Platforms Performance [from support.microsoft.com]
http://support.microsoft.com/default.aspx?scid=kb;EN-US;974348 - great stuff ms.
Note: This article discusses a scenario that may be addressed by Microsoft Advisory Services.
Microsoft Advisory Services is an hourly fee-based, consultative support option that provides proactive support beyond your break-fix product maintenance needs. This is a remote, phone-based support option that includes working with the same technician for assistance with issues like product migration, code review, or new program development. This service is typically used for shorter engagements, and is designed for developers and IT professionals who do not require the traditional onsite consulting or sustained account management services that are available from other Microsoft support options. This article also provides some self-help resources for this scenario.
For additional information on Microsoft Advisory Services, including on how to engage, refer to this Microsoft web page:
http://support.microsoft.com/default.aspx?pr=AdvisoryService
Microsoft can provide Performance Tuning services for your software solutions, Microsoft development technologies and platforms such as the Common Language Runtime (CLR) framework, Internet Information Server (IIS), developer tools like Visual Studio .NET, and other Microsoft software development related tools and solutions. Examples include:
IIS/ASP performance diagnostics and tuning, such as CPU spikes and slow user experience.
CLR performance debugging, profiling, detailed analysis, and architectural best practices.
Team Foundation Server performance issues, such as Work Item Tracking, Source Control, Reporting, and Server Responsiveness.
Below is a list of self-help resources for this scenario. These resources may also be used by Microsoft Support Engineers during an Advisory Services engagement.
IIS 6 Performance Forum
A forum aimed at sharing ideas and techniques for optimizing IIS performance.
http://forums.iis.net/1037.aspx
IIS 7 Performance Forum
Discussion on how to effectively tweak IIS 7 to resolve issues related to performance.
http://forums.iis.net/1050.aspx
PAG Performance Testing Guide
This guide shows you an end-to-end approach for implementing performance testing. Whether you are new to performance testing, or looking for ways to improve your current performance testing approach, you will find insights that you can tailor for your specific scenarios.
http://msdn.microsoft.com/en-us/library/bb924375.aspx
PAG Improving .NET Application Performance and Scalability
This guide provides end-to-end guidance for managing performance and scalability throughout your application life cycle to reduce risk and lower total cost of ownership. It provides a framework that organizes performance into a handful of prioritized categories where your choices heavily impact performance and scalability success.
http://msdn.microsoft.com/en-us/library/ms998530.aspx
Optimizing Performance of Database Access in IIS
This document provides some "best practices" for optimizing database access.
http://msdn.microsoft.com/en-us/library/ms525484.aspx
ASP.NET Performance Monitoring and When to Alert Administrators
Discusses which performance counters are most helpful in diagnosing stress and performance issues in Microsoft ASP.NET applications, what thresholds should be set in order to alert administrators to problems, and other resources that can be used to monitor the health of an ASP.NET application.
http://msdn.microsoft.com/en-us/library/ms972959.aspx
Designing Scalable IIS Applications
This guide provides design considerations for performance and scalability.
http://msdn.microsoft.com/en-us/library/ms525522.aspx
Scaling Strategies for ASP.NET Applications
This MSDN article discusses scaling requirements for ASP.NET applications.
http://msdn.microsoft.com/en-us/magazine/cc500561.aspx?pr=blog
Tools for Optimizing Performance
This document lists brief descriptions of performance-optimizing tools and how to access them.
http://msdn.microsoft.com/en-us/library/dd328379.aspx
Design and Configuration for Performance (ASP.NET)
This topic discusses design, configuration, compilation, and memory options available to improve the performance of a Web application.
http://msdn.microsoft.com/en-us/library/ms227998.aspx
ASP.NET Performance Articles
This document provides a listing of articles that can be used to improve and monitor the performance of your applications.
http://msdn.microsoft.com/en-us/library/44e5wy6k(VS.85).aspx
Troubleshooting a Performance Issue with Failed Request Tracing and appcmd in IIS 7
This blog post discusses how to troubleshoot a performance issue in IIS 7.
http://blogs.msdn.com/tess/archive/2008/08/19/troubleshooting-a-performance-issue-with-failed-request-tracing-and-appcmd-in-iis7.aspx
Tess's "Performance Issues and Hangs"
List of blog postings regarding performance issues.
http://blogs.msdn.com/tess/archive/tags/Performance+issues+and+hangs/default.aspx
How to View What ASP.NET Requests Are Doing at Runtime
This blog post discusses a tool, runnable against IIS 6.0, with a user interface that lets you choose a process and thread and see the managed call stack at that point in time.
http://blogs.msdn.com/webtopics/archive/2009/05/05/how-to-view-what-asp-net-requests-are-doing-at-runtime-on-iis-6-0.aspx
Troubleshooting System.OutOfMemoryExceptions in ASP.NET
This blog post discusses how to troubleshoot OutOfMemoryExceptions in ASP.NET.
http://blogs.msdn.com/webtopics/archive/2009/05/22/Troubleshooting-System.OutOfMemoryExceptions-in-ASP.NET.aspx
Javascript Resources
Profiling Script with the Developer Tools
This page provides an introduction to script profiling, which can be used to identify and resolve performance-related issues.
http://msdn.microsoft.com/en-us/library/dd565629(VS.85).aspx
JScript Debugger in Internet Explorer 8
This blog post provides information on how to use the JScript debugger.
http://blogs.msdn.com/jscript/archive/2008/03/13/jscript-debugger-in-internet-explorer-8.aspx
Improved Productivity Through Internet Explorer 8 Developer Tools
This blog post focuses on the developer tools available for Internet Explorer 8.
http://blogs.msdn.com/ie/archive/2008/03/07/improved-productivity-through-internet-explorer-8-developer-tools.aspx
IE + JavaScript Performance Recommendations - Part 1
This blog post discusses ways to improve the performance of JavaScript.
http://blogs.msdn.com/ie/archive/2006/08/28/728654.aspx
IE + JavaScript Performance Recommendations - Part 2: JavaScript Code Inefficiencies
This blog post discusses how to avoid JavaScript code inefficiencies.
http://blogs.msdn.com/ie/archive/2006/11/16/ie-javascript-performance-recommendations-part-2-javascript-code-inefficiencies.aspx
IE + JavaScript Performance Recommendations - Part 3: JavaScript Code Inefficiencies
This blog post focuses on specific inefficiencies related to closures and object-oriented programming.
http://blogs.msdn.com/ie/archive/2007/01/04/ie-jscript-performance-recommendations-part-3-javascript-code-inefficiencies.aspx
Performance Optimization of Arrays - Part 1
This blog post discusses how to improve performance of Array operations
http://blogs.msdn.com/jscript/archive/2008/03/25/performance-optimization-of-arrays-part-i.aspx
Performance Optimization of Arrays - Part II
This blog post continues the discussion on how to improve performance of Array operations.
http://blogs.msdn.com/jscript/archive/2008/04/08/performance-optimization-of-arrays-part-ii.aspx
Performance issues with "String Concatenation" in JScript
This blog post focuses on how to improve the performance of string concatenations in JScript.
http://blogs.msdn.com/jscript/archive/2007/10/17/performance-issues-with-string-concatenation-in-jscript.aspx
Eval is Evil - Part 1
This blog post provides alternative solutions to the use of the Eval method.
http://blogs.msdn.com/ericlippert/archive/2003/11/01/53329.aspx
Eval is Evil - Part two
This blog post continues the discussion of alternate solutions to the Eval method.
http://blogs.msdn.com/ericlippert/archive/2003/11/04/53335.aspx
Networking Resources
How to capture network traffic with Network Monitor
The purpose of this article is to provide you with the information needed to capture network traffic from a local area network using Microsoft's Network Monitor.
http://support.microsoft.com/kb/148942
Fiddler PowerToy - Part 1: HTTP Debugging
Learn how to use the Microsoft Fiddler HTTP debugger when developing and testing Web applications and clients.
http://msdn.microsoft.com/en-us/library/bb250446(VS.85).aspx
Fiddler PowerToy - Part 2: HTTP Performance
Learn how to build a faster Web site using the Microsoft Fiddler HTTP Debugger.
http://msdn.microsoft.com/en-us/library/bb250442(VS.85).aspx
Part 2: TCP Performance Expert and General Trouble Shooting
This blog discusses TCP Performance Expert and General Troubleshooting skills.
http://blogs.technet.com/netmon/archive/2007/01/26/part-2-tcp-performance-expert-and-general-trouble-shooting.aspx
TCP Analyzer Expert: Make Your Network Run Faster
This blog post focuses on how to use the TCP Analyzer Expert.
http://blogs.technet.com/netmon/archive/2009/06/30/tcp-analyzer-expert-make-your-network-run-faster.aspx
Tools Resources
Fiddler Web Debugging Proxy
Fiddler is a Web Debugging Proxy which logs all HTTP(S) traffic between your computer and the Internet. This document discusses use of the tool.
http://www.fiddlertool.com/fiddler/
AjaxScope
Ajax View enables developers to see and control the behaviors of their web applications on users' desktops. This page discusses how to use this tool.
http://research.microsoft.com/en-us/projects/ajaxview/
Microsoft Network Monitor
Tool to allow capturing and protocol analysis of network traffic.
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=983b941d-06cb-4658-b7f6-3088333d062f
Debugging tools for Windows
You can use Debugging Tools for Windows to debug drivers, applications, and services on systems that are running Windows NT 4.0, Windows 2000, Windows XP, Windows Server 2003, Windows Vista, or Windows Server 2008.
http://www.microsoft.com/whdc/devtools/debugging/default.mspx
DebugDiag
The Debug Diagnostic Tool (DebugDiag) is designed to assist in troubleshooting issues such as hangs, slow performance, memory leaks or fragmentation, and crashes in any Win32 user-mode process.
http://www.microsoft.com/downloads/details.aspx?FamilyID=28bd5941-c458-46f1-b24d-f60151d875a3&displaylang=en
neXpert Performance Tool
Discussion On Using Fiddler and neXpert To Identify and Fix Web Performance Issues.
http://blogs.msdn.com/nexpert/
Strace
STRACE is a socket/SSL tracer designed to generate logs for Internet Explorer.
http://www.microsoft.com/downloads/details.aspx?familyid=F5EC767F-27F2-4FB3-90A5-4BF0D5F4810A&displaylang=en
HTTPReplay
HTTPREPLAY is a SOCKTRC plugin that allows you to analyze and replay HTTP traffic.
http://www.microsoft.com/downloads/details.aspx?familyid=d25ba362-c17b-4d80-a677-1faff862e629&displaylang=en&tm
Design Resources
Designing for Add-on Performance
Blog post describing how to improve add-on performance with Internet Explorer.
http://blogs.msdn.com/ie/archive/2008/04/04/designing-for-add-on-performance.aspx
Performance Considerations in Internet Explorer
This page provides links and tips for getting extra performance from DHTML, Script, Web Servers, ActiveX Controls, Java Applets, and Plugins.
http://msdn.microsoft.com/en-us/library/ms533021(VS.85).aspx
Faster DHTML in 12 Steps
This article describes how using some DHTML features can affect performance more than others, and it presents tips that will help your pages perform faster.
http://msdn.microsoft.com/en-us/library/ms533019(VS.85).aspx
Building High Performance HTML Pages
This article presents some tips on how you can get the most performance out of your pages.
http://msdn.microsoft.com/en-us/library/ms533020(VS.85).aspx
Frequent Flyers: Boosting Performance on DHTML Pages
This blog post discusses how to get better performance from your Dynamic HTML pages.
http://msdn.microsoft.com/en-us/library/bb264005(VS.85).aspx
Asynchrony: Loved Your Performance
This article discusses how to improve performance with asynchrony.
http://msdn.microsoft.com/en-us/library/bb263994(VS.85).aspx
Building ActiveX Controls for Internet Explorer
This article covers features of Windows Internet Explorer that a developer writing Microsoft ActiveX Controls should take into account when targeting Internet Explorer as a container.
http://msdn.microsoft.com/en-us/library/aa751970(VS.85).aspx
Memory Leak Resources
Understanding and Solving Internet Explorer Leak Patterns
This article discusses how to troubleshoot and resolve memory leaks.
http://msdn.microsoft.com/en-us/library/bb250448(VS.85).aspx
General Resources
Learn Internet Explorer
A series of topics designed to teach Internet Explorer programming.
http://msdn.microsoft.com/en-us/ie/aa740473.aspx
Measuring Browser Performance: Understanding issues in benchmarking and performance analysis
This document explains the various browser and network components and how each piece can impact performance when benchmarking.
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=cd8932f3-b4be-4e0e-a73b-4a373d85146d
IE8 Performance
Blog post discussing the performance changes in Internet Explorer 8.
http://blogs.msdn.com/ie/archive/2008/08/26/ie8-performance.aspx
Common Issues in Assessing Browser Performance
This blog post focuses on performance with a discussion around some of the issues impacting browser performance testing and the techniques that you can use to effectively measure browser performance.
http://blogs.msdn.com/ie/archive/2009/01/23/common-issues-in-assessing-browser-performance.aspx
How to improve browsing performance in Internet Explorer
This article describes how to improve browsing performance in Internet Explorer.
http://support.microsoft.com/kb/153790
How to optimize Internet Explorer
This article describes how to reset or optimize Internet Explorer 7.
http://support.microsoft.com/kb/936213
How to troubleshoot Internet Explorer issues in Windows Vista and in Windows XP
This article discusses how to troubleshoot possible issues that you may experience when you use Windows Internet Explorer 7 or Windows Internet Explorer 8 on a computer that is running Windows Vista.
http://support.microsoft.com/kb/936215
Note This is a "FAST PUBLISH" article created directly from within the Microsoft support organization. The information contained herein is provided as-is in response to emerging issues. As a result of the speed in making it available, the materials may include typographical errors and may be revised at any time without notice. See Terms of Use (http://go.microsoft.com/fwlink/?LinkId=151500) for other considerations.
at 12:57 PM 0 comments Posted by roni schuetz
Labels: .net, performance
Tuesday, November 16, 2010
nServiceBus Licenses
really cool projects
******************************
NHibernate is licensed under the LGPL v2.1 license as described here:
http://www.hibernate.org/license.html
NHibernate binaries are merged into NServiceBus allowed under the LGPL license terms found here:
http://www.gnu.org/licenses/old-licenses/lgpl-2.1.txt
******************************
LinFu is licensed under the LGPL v3 license as described here:
http://code.google.com/p/linfu/
LinFu binaries are merged into NServiceBus allowed under the LGPL license terms found here:
http://www.gnu.org/licenses/lgpl-3.0.txt
******************************
Iesi.Collections binaries are merged into NServiceBus allowed under the license terms found here:
Copyright © 2002-2004 by Aidant Systems, Inc., and by Jason Smith.
Copied from http://www.codeproject.com/csharp/sets.asp#xx703510xx that was posted by JasonSmith 12:13 2 Jan '04
Feel free to use this code any way you want to. As a favor to me, you can leave the copyright in there. You never know when someone might recognize your name!
If you do use the code in a commercial product, I would appreciate hearing about it. This message serves as legal notice that I won't be suing you for royalties! The code is in the public domain.
On the other hand, I don't provide support. The code is actually simple enough that it shouldn't need it.
******************************
Fluent NHibernate is licensed under the BSD license as described here:
http://github.com/jagregory/fluent-nhibernate/raw/master/LICENSE.txt
Fluent NHibernate binaries are merged into NServiceBus allowed under the terms of the license.
******************************
Autofac is licensed under the MIT license as described here:
http://code.google.com/p/autofac/
Autofac binaries are linked into the NServiceBus distribution allowed under the license terms found here:
http://www.opensource.org/licenses/mit-license.php
******************************
Spring.NET is licensed under the Apache license version 2.0 as described here:
http://www.springframework.net/license.html
Spring.NET binaries are merged into NServiceBus allowed under the license terms found here:
http://www.apache.org/licenses/LICENSE-2.0.txt
******************************
Antlr is licensed under the BSD license as described here:
http://antlr.org/license.html
Antlr binaries are merged into NServiceBus allowed under the license terms described above.
******************************
Common.Logging is licensed under the Apache License, Version 2.0 as described here:
http://netcommon.sourceforge.net/license.html
Common.Logging binaries are merged into NServiceBus allowed under the LGPL license terms found here:
http://www.apache.org/licenses/LICENSE-2.0.txt
******************************
StructureMap is licensed under the Apache License, Version 2.0 as described here:
http://structuremap.github.com/structuremap/index.html
StructureMap binaries are linked into the NServiceBus distribution allowed under the license terms found here:
http://www.apache.org/licenses/LICENSE-2.0.txt
******************************
Castle is licensed under the Apache License, Version 2.0 as described here:
http://www.castleproject.org/
Castle binaries are linked into the NServiceBus distribution allowed under the license terms found here:
http://www.apache.org/licenses/LICENSE-2.0.txt
******************************
Unity is licensed under the MS-PL license as described here:
http://unity.codeplex.com/license
Unity binaries are linked into the NServiceBus distribution allowed under the license terms described above.
******************************
Log4Net is licensed under the Apache License, Version 2.0 as described here:
http://logging.apache.org/log4net/license.html
Log4Net binaries are linked into the NServiceBus distribution allowed under the license terms described above.
******************************
TopShelf is licensed under the Apache License, Version 2.0 as described here:
http://code.google.com/p/topshelf/
TopShelf binaries are merged into NServiceBus as allowed under the license terms described here:
http://www.apache.org/licenses/LICENSE-2.0.txt
******************************
SQLite is in the public domain as described here:
http://www.sqlite.org/copyright.html
SQLite binaries are linked into the NServiceBus distribution allowed under the license terms described above.
******************************
Rhino Mocks is licensed under the BSD License as described here:
http://www.ayende.com/projects/rhino-mocks.aspx
Rhino Mocks binaries are merged into NServiceBus allowed under the license terms described here:
http://www.opensource.org/licenses/bsd-license.php
at 12:14 AM 0 comments Posted by roni schuetz
Labels: .net, open source
Monday, November 08, 2010
writing windows shell extension
Part 1: http://blogs.msdn.com/b/codefx/archive/2010/09/14/writing-windows-shell-extension-with-net-framework-4-c-vb-net-part-1.aspx
Part 2: http://blogs.msdn.com/b/codefx/archive/2010/10/10/writing-windows-shell-extension-with-net-framework-4-c-vb-net-part-2.aspx?wa=wsignin1.0
Part 3: http://blogs.msdn.com/b/codefx/archive/2010/11/07/writing-windows-shell-extension-with-net-framework-4-c-vb-net-part-3-thumbnail-handler.aspx
at 8:26 AM 0 comments Posted by roni schuetz
Labels: .net 4, links, Windows Shell
Thursday, September 30, 2010
LINQ Evaluation From Basics to Implementation
Many of us are aware of LINQ today. LINQ is an amazing programming language feature. During my discussions I find a lot of myths around understanding it. Here I have tried to demonstrate how it has evolved – from the basics of the C# 3.0 language enhancements to delegates, anonymous types, lambda expressions, and finally LINQ.
http://channel9.msdn.com/posts/LINQ-Evaluation-From-Basics-to-Implementation
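As a minimal sketch of that same evolution (my own example, not taken from the video):
using System;
using System.Collections.Generic;
using System.Linq;
class Evolution
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3, 4, 5, 6 };
        // C# 2.0: anonymous method assigned to a delegate
        Predicate<int> isEven = delegate(int i) { return i % 2 == 0; };
        List<int> a = numbers.FindAll(isEven);
        // C# 3.0: lambda expression with an extension method
        var b = numbers.Where(i => i % 2 == 0);
        // LINQ query syntax, projecting into an anonymous type
        var c = from n in numbers
                where n % 2 == 0
                select new { Value = n, Square = n * n };
        foreach (var item in c)
            Console.WriteLine("{0} -> {1}", item.Value, item.Square);
    }
}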
at 6:24 PM 0 comments Posted by roni schuetz
Sunday, September 19, 2010
Monday, July 19, 2010
Strong name validation failed. (Exception from HRESULT: 0x8013141A)
System.Security.SecurityException: Strong name validation failed. (Exception from HRESULT: 0x8013141A) The Zone of the assembly that failed was "My Computer"
step 1) grab the following code: http://www.mikevdm.com/BlogEntry/Key/Simple-Gacutil-Replacement
step 2) add a post build event:
- $(SolutionDir)Tools\Gacutil.exe remove $(TargetPath)
- $(SolutionDir)Tools\Gacutil.exe add $(TargetPath)
step 3) run from the console: D:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\x64>sn.exe -Vr *,[public key token]
estimated time to solve: 30 - 60 minutes
at 12:16 PM 0 comments Posted by roni schuetz
Labels: .net 4
assign TFS Workitems to different users
The following code displays how it's possible to set the assigned-to field from one user to another user. I had to write this code because in my instance the console command did not work:
TFSConfig Identities /change /fromdomain:Contoso1 /todomain:ContosoPrime /account:Contoso1\hholt /toaccount:ContosoPrime\jpeoples
Since the created-by field is a read-only field, this code will not change that value. If you really need to handle it, I think it should be possible to manipulate the XML of your work items and handle it that way – but that sounds like a lot of work to me.
Add the following using directives to your classes:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Net;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation;
using Microsoft.TeamFoundation.Server;
using Microsoft.TeamFoundation.Framework.Client;
using System.Collections.ObjectModel;
using Microsoft.TeamFoundation.Framework.Common;
using Microsoft.TeamFoundation.WorkItemTracking.Client;
using Microsoft.TeamFoundation.VersionControl.Client;
Helper methods:
public static string url = @"https://tfs.yourdomain.com/tfs";
[System.Diagnostics.DebuggerStepThrough]
public static TfsConfigurationServer GetTfsConfigurationServer()
{
return new TfsConfigurationServer(new Uri(url), GetCredentials(), new UICredentialsProvider());
}
[System.Diagnostics.DebuggerStepThrough]
private static ICredentials GetCredentials()
{
return new NetworkCredential("user", "password", "domain");
}
[System.Diagnostics.DebuggerStepThrough]
public static bool EnsureAuthentication(TfsConfigurationServer srv)
{
bool result = true;
try
{
srv.EnsureAuthenticated();
srv.Authenticate();
result = srv.HasAuthenticated;
}
catch (Exception)
{
result = false;
}
return result;
}
The main code to re-assign your workitem assign-to field over all TFS collections:
This code is straightforward, without any add-ons... enjoy
static void Main(string[] args)
{
List<Identity> tfsIdentities = Helper.GetAllTfsUsers();
Console.WriteLine("Connecting to Server: " + Helper.url);
TfsConfigurationServer srv = Helper.GetTfsConfigurationServer();
Console.WriteLine("Ensure Authenticated: " + Helper.url);
srv.EnsureAuthenticated();
srv.Authenticate();
int counter = 0;
if (srv.HasAuthenticated)
{
CatalogNode configurationServerNode = srv.CatalogNode;
// Query the children of the configuration server node for all of the team project collection nodes
ReadOnlyCollection<CatalogNode> tpcNodes = configurationServerNode.QueryChildren(
new Guid[] { CatalogResourceTypes.ProjectCollection },
false,
CatalogQueryOptions.None);
foreach (CatalogNode tpcNode in tpcNodes)
{
Guid tpcId = new Guid(tpcNode.Resource.Properties["InstanceId"]);
TfsTeamProjectCollection tpc = srv.GetTeamProjectCollection(tpcId);
// Do your tpc work here.
Console.WriteLine("{0}", tpc.Name);
// get a reference to the work item tracking service
var workItemStore = tpc.GetService<WorkItemStore>();
if (workItemStore.Projects.Count <= 0)
{
// go over the next project
continue;
}
// iterate over the projects
foreach (Project project in workItemStore.Projects)
{
Console.WriteLine("\tProject: {0}", project.Name);
VersionControlServer versionControl = (VersionControlServer)tpc.GetService(typeof(VersionControlServer));
TeamProject teamProject = versionControl.GetTeamProject(project.Name);
IGroupSecurityService gss = (IGroupSecurityService)tpc.GetService<IGroupSecurityService>();
Identity[] appGroups = gss.ListApplicationGroups(teamProject.ArtifactUri.AbsoluteUri);
foreach (Identity group in appGroups)
{
Identity[] groupMembers = gss.ReadIdentities(SearchFactor.Sid, new string[] { group.Sid }, QueryMembership.Expanded);
foreach (Identity member in groupMembers)
{
if (member.Members != null)
{
foreach (string memberSid in member.Members)
{
Identity memberInfo = gss.ReadIdentity(SearchFactor.Sid, memberSid, QueryMembership.None);
if (memberInfo.Type == IdentityType.WindowsUser)
{
// Console.WriteLine("\t" + memberInfo.AccountName + " - " + memberInfo.DisplayName + " - " + memberInfo.Domain);
if (!tfsIdentities.Contains(memberInfo))
tfsIdentities.Add(memberInfo);
}
}
}
}
}
WorkItemCollection workItemCollection = workItemStore.Query(
" SELECT [System.Id], [System.WorkItemType]," +
" [System.State], [System.AssignedTo], [System.Title] " +
" FROM WorkItems " +
" WHERE [System.TeamProject] = '" + project.Name +
"' ORDER BY [System.WorkItemType], [System.Id]");
foreach (WorkItem item in workItemCollection)
{
counter++;
bool containsOldDomain = item.Fields[CoreField.AssignedTo].Value.ToString().Contains(@"OldValidUser");
if (containsOldDomain)
{
WorkItem item1 = workItemStore.GetWorkItem(item.Id);
item1.Fields[CoreField.AssignedTo].Value = "NewValidUser";
try
{
bool a = (item1.Validate().Count == 0);
if (a)
{
Console.WriteLine("\t\tValid: " + item.Id + item.Title);
item1.Save();
}
else
{
Console.WriteLine("\t\tNot Valid: " + item.Id + item.Title);
}
}
catch (Exception ex)
{
Console.WriteLine(item.Id + ": " + ex.Message);
}
}
}
// clear users
tfsIdentities.Clear();
}
}
}
Console.WriteLine("Workitem count {0}", counter);
Console.WriteLine("Press enter to exit");
Console.ReadLine();
}
at 8:36 AM 0 comments Posted by roni schuetz
Retrieve a list with all your TFS Users
The following code demonstrates how to read all users from different TFS collections.
namespace Migrator
{
using System;
using System.Linq;
using System.Reflection;
using Microsoft.TeamFoundation.Client;
using System.Net;
using Microsoft.TeamFoundation.Server;
using System.Collections.Generic;
using Microsoft.TeamFoundation.Framework.Client;
using System.Collections.ObjectModel;
using Microsoft.TeamFoundation.Framework.Common;
using Microsoft.TeamFoundation.WorkItemTracking.Client;
using Microsoft.TeamFoundation.VersionControl.Client;
public static class Helper
{
// without collection name
public static string url = @"http://tfs.yoururl.com/tfs";
[System.Diagnostics.DebuggerStepThrough]
public static TfsConfigurationServer GetTfsConfigurationServer()
{
return new TfsConfigurationServer(new Uri(url), GetCredentials(), new UICredentialsProvider());
}
[System.Diagnostics.DebuggerStepThrough]
private static ICredentials GetCredentials()
{
return new NetworkCredential("adminuser", "password", "YourDomain");
}
[System.Diagnostics.DebuggerStepThrough]
public static bool EnsureAuthentication(TfsConfigurationServer srv)
{
bool result = true;
try
{
srv.EnsureAuthenticated();
srv.Authenticate();
result = srv.HasAuthenticated;
}
catch (Exception)
{
result = false;
}
return result;
}
public static List<Identity> GetAllTfsUsers()
{
List<Identity> result = new List<Identity>();
TfsConfigurationServer srv = Helper.GetTfsConfigurationServer();
if (EnsureAuthentication(srv))
{
CatalogNode configurationServerNode = srv.CatalogNode;
// Query the children of the configuration server node for all of the team project collection nodes
ReadOnlyCollection<CatalogNode> tpcNodes = configurationServerNode.QueryChildren(
new Guid[] { CatalogResourceTypes.ProjectCollection },
false,
CatalogQueryOptions.None
);
foreach (CatalogNode tpcNode in tpcNodes)
{
Guid tpcId = new Guid(tpcNode.Resource.Properties["InstanceId"]);
TfsTeamProjectCollection tpc = srv.GetTeamProjectCollection(tpcId);
Console.WriteLine("{0}", tpc.Name);
// get a reference to the work item tracking service
var workItemStore = tpc.GetService<WorkItemStore>();
// go over the next node if no projects available
if (workItemStore.Projects.Count <= 0)
{
continue;
}
// iterate over the projects
foreach (Project project in workItemStore.Projects)
{
Console.WriteLine("\tProject: {0}", project.Name);
try
{
VersionControlServer versionControl = (VersionControlServer)tpc.GetService(typeof(VersionControlServer));
TeamProject teamProject = versionControl.GetTeamProject(project.Name);
IGroupSecurityService gss = (IGroupSecurityService)tpc.GetService<IGroupSecurityService>();
Identity[] appGroups = gss.ListApplicationGroups(teamProject.ArtifactUri.AbsoluteUri);
foreach (Identity group in appGroups)
{
Identity[] groupMembers = gss.ReadIdentities(SearchFactor.Sid, new string[] { group.Sid }, QueryMembership.Expanded);
foreach (Identity member in groupMembers)
{
if (member.Members != null)
{
foreach (string memberSid in member.Members)
{
Identity memberInfo = gss.ReadIdentity(SearchFactor.Sid, memberSid, QueryMembership.None);
if (memberInfo.Type == IdentityType.WindowsUser)
{
if (!result.Contains(memberInfo))
{
result.Add(memberInfo);
Console.WriteLine("\t\t" + memberInfo.AccountName + " - " + memberInfo.DisplayName + " - " + memberInfo.Domain);
}
else
{
Console.WriteLine("\t\tUser already available " + memberInfo.AccountName);
}
}
}
}
}
}
}
catch (Exception ex)
{
Console.WriteLine("\tThe Project: '{0}' throws an exception: {1} and will be ignored.", project.Name, ex.Message);
}
} // foreach (Project project in workItemStore.Projects)
} // foreach (CatalogNode tpcNode in tpcNodes)
}
else
{
Console.WriteLine("Authentication problem!");
}
return result;
}
}
}
at 8:29 AM 0 comments Posted by roni schuetz
Sunday, July 18, 2010
What is System Engineering
System Engineering is an interdisciplinary field of engineering that focuses on how complex engineering projects should be designed and managed. Coordination of different teams and automatic control of machinery become more difficult when dealing with large, complex projects, or with a big number of them.
System Engineering deals with work processes and tools to handle projects, and it overlaps with both technical and human-centric disciplines such as control engineering and project management.
Concept
System Engineering signifies both an approach and, more recently, a discipline in engineering. The aim of education in System Engineering is simply to formalize the approach and, in doing so, identify new methods and research opportunities, similar to the way it occurs in other fields of engineering. As an approach, System Engineering is holistic and interdisciplinary in flavor.
Holistic View
System Engineering focuses on defining customer needs and required functionality early in the development cycle, documenting requirements, and then proceeding with design synthesis and system validation while considering the complete problem: the system lifecycle. The System Engineering process can be decomposed into:
• a System Engineering Technical Process
• a System Engineering Management Process
Managing complexity
The need for system engineering arose with the increase in complexity of systems and projects. Complexity in this context incorporates not only engineering systems, but also the logical human organization of data. At the same time, a system can become more complex due to an increase in size as well as an increase in the amount of data, variables, or the number of fields involved in the design.
Scope
One way to understand the motivation behind system engineering is to see it as a method, or practice, to identify and improve common rules that exist within a wide variety of systems. Keeping this aspect in mind, the principles of System Engineering can be applied to any system, complex or otherwise, provided system thinking is employed at all levels.
System engineering encourages the use of modeling and simulation to validate assumptions or theories on systems and the interactions with and within them.
Use of methods that allow early detection of possible failures, as in safety engineering, is integrated into the design process. At the same time, decisions made at the beginning of a project whose consequences are not clearly understood can have enormous implications later in the life cycle of a system, and it is the task of the modern system engineer to explore these issues and make critical decisions. There is no method which guarantees that decisions made today will still be valid when a system goes into service years or decades after it is first conceived, but there are techniques to support the process of system engineering.
at 7:27 PM 0 comments Posted by roni schuetz
Add Bookmark over JS
// add-bookmark: call it from a link, e.g. onclick="return bookmark(this)"
function bookmark(anchor) {
    if (window.external) {
        window.external.AddFavorite(anchor.getAttribute('href'), anchor.getAttribute('title'));
        return false;
    }
    return true;
}
at 7:14 PM 0 comments Posted by roni schuetz
Labels: java script
Sunday, June 20, 2010
Paint.Net alternative to Photoshop
If you are using Paint.NET, then check out this site [http://paint.net.amihotornot.com.au/] with a great overview of various plug-ins.
at 1:04 PM 0 comments Posted by roni schuetz
Labels: image
Sunday, May 30, 2010
Top Shots from my Tenerife scuba dive week - May 2010
More pictures can be found on my Facebook profile (pictures are publicly available).
If you like to dive in Tenerife, I can suggest visiting Aqua-Marina. Great staff over there, and they speak various languages - quite a multicultural crowd.
at 3:34 PM 0 comments Posted by roni schuetz
Labels: scuba diving, underwater photography
Sunday, May 09, 2010
serializing.info
I bought a new domain name: serializing.info
Under this domain I would like to aggregate various information around serializing. If you have content which you would like to distribute here, you're welcome to send me an email.
at 1:39 PM 0 comments Posted by roni schuetz
Labels: general
Finance Terms - always good to know
1. NAV
a. Net Asset Value
b. The value of the mutual fund, calculated daily (sometimes more often)
c. (Total value of fund – dividends) / number of shares issued (see the sketch after this list)
2. OCIC
a. Open End Investment Company
b. A type of mutual fund
3. SEC
a. Securities and Exchange Commission
4. IRA
a. Individual Retirement Account
b. Allows setting aside $2K per year in mutual funds without tax
5. ICI
a. Investment Company Institute
b. National association of investment companies
6. Sector Funds
a. Funds that invest in one type of industry
b. Technology, biomedical, etc.
7. POP
a. Public Offering Price
b. Used when selling a MF
c. NAV + sales charge
8. Turnover
a. When a MF invests in securities, it buys and sells these securities; if a capital gain is achieved, the shareholders of the MF will be taxed and will also pay the buying/selling fees
b. (Total buys + sells) / (2 × fund total holdings)
9. MF Fees
a. Management fees – actual fees for managing the fund + administration
b. Non-management fees – fees for other non-managing parties like custodial, accountant, board of directors, SEC registration fee, etc.
c. 12b-1 + non-12b-1 fees – 12b-1 covers marketing expenses of the fund according to the SEC rules; non-12b-1 covers other marketing not under SEC rules
d. Fees paid by the investor based on arrangements with the investor's broker
10. CDSL
a. Contingent Deferred Sales Load
11. TER
a. The fund's Total Expense Ratio
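A tiny C# sketch of the NAV and POP formulas from items 1 and 7 (toy numbers, purely illustrative):
double totalFundValue = 1250000.0;          // total value of the fund
double sharesIssued = 100000.0;             // number of shares issued
double nav = totalFundValue / sharesIssued; // NAV = 12.50
double salesCharge = 0.25;                  // hypothetical sales charge
double pop = nav + salesCharge;             // POP = NAV + sales charge = 12.75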
at 12:06 AM 0 comments Posted by roni schuetz
Labels: general
Saturday, May 08, 2010
Disk I/O optimization
Disk I/O refers to the number of read and write operations performed by your
application on a physical disk or multiple disks installed in your server. Common
activities that can cause disk I/O-related bottlenecks include long-running file I/O
operations, data encryption and decryption, reading unnecessary data from database
tables, and a shortage of physical memory that leads to excessive paging activity.
Slow hard disks are another factor to consider.
To resolve disk-related bottlenecks:
- Start by removing any redundant disk I/O operations in your application.
- Identify whether your system has a shortage of physical memory, and,
if so, add more memory to avoid excessive paging.
- Identify whether you need to separate your data onto multiple disks.
- Consider upgrading to faster disks if you still have disk I/O
bottlenecks after doing all of the above.
Configuration Overview
Microsoft Windows retrieves programs and data from disk. The disk
subsystem can be the most important aspect of I/O performance, but
problems can be masked by other factors, such as lack of memory.
Performance console disk counters are available within both the
LogicalDisk and PhysicalDisk objects.
Metrics
PhysicalDisk
- Avg. Disk Queue Length
- Avg. Disk Read Queue Length
- Avg. Disk Write Queue Length
- Avg. Disk sec/Read
- Avg. Disk sec/Transfer
- Disk Writes/sec
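If you want to sample these counters from code rather than the Performance console, here is a small C# sketch using System.Diagnostics ("_Total" aggregates across all physical disks):
using System;
using System.Diagnostics;
using System.Threading;
class DiskCounterSample
{
    static void Main()
    {
        var queueLength = new PerformanceCounter("PhysicalDisk", "Avg. Disk Queue Length", "_Total");
        var secPerRead = new PerformanceCounter("PhysicalDisk", "Avg. Disk sec/Read", "_Total");
        // The first NextValue() call primes rate counters and returns 0, so sample twice.
        queueLength.NextValue();
        secPerRead.NextValue();
        Thread.Sleep(1000);
        Console.WriteLine("Avg. Disk Queue Length: {0:F2}", queueLength.NextValue());
        Console.WriteLine("Avg. Disk sec/Read:     {0:F4}", secPerRead.NextValue());
    }
}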
Tuning Options
If you determine that disk I/O is a bottleneck, you have a number of options:
- Defragment your disks. Use the Disk Defragmenter system tool.
- Use Diskpar.exe on Windows 2000 to reduce performance loss due to misaligned
disk tracks and sectors. You can get Diskpar.exe from the Windows 2000
Resource Kit.
- Use stripe sets to process I/O requests concurrently over multiple disks. The
type you use depends on your data-integrity requirements. If your applications
are read-intensive and require fault tolerance, consider a RAID 5 volume. Use
mirrored volumes for fault tolerance and good I/O performance overall. If you do
not require fault tolerance, implement stripe sets for fast reading and writing and
improved storage capacity. When stripe sets are used, disk utilization per disk
should fall due to distribution of work across the volumes, and overall throughput
should increase.
- If you find that there is no increased throughput when scaling to additional disks
in a stripe set, your system might be experiencing a bottleneck due to contention
between disks for the disk adapter. You might need to add an adapter to better
distribute the load.
- Place multiple drives on separate I/O buses, particularly if a disk has an
I/O-intensive workload.
- Distribute workload among multiple drives. Windows Clustering and
Distributed File System provide solutions for load balancing on different drives.
- Limit your use of file compression or encryption. File compression and
encryption are I/O-intensive operations. You should only use them where
absolutely necessary.
- Disable creation of short names. If you are not supporting MS-DOS or Windows
3.x clients, disable short names to improve performance. To disable short names,
change the default value of the \NtfsDisable8dot3NameCreation registry entry
(in HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Filesystem) to 1.
- Disable last access update. By default, NTFS updates the date and time stamp
of the last access on directories whenever it traverses the directory. For a large
NTFS volume, this update process can slow performance. To disable automatic
updating, create a new REG_DWORD registry entry named
NtfsDisableLastAccessUpdate in HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Filesystem and set its value to 1.
- Reserve appropriate space for the master file table. Add the
NtfsMftZoneReservation entry to the registry as a REG_DWORD in
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem.
When you add this entry to the registry, the system reserves space on the volume
for the master file table. Reserving space in this manner allows the master file
table to grow optimally. If your NTFS volumes generally contain relatively few
files that are large, set the value of this registry entry to 1 (the default).
- Typically you can use a value of 2 or 3 for moderate numbers of files, and use a value of 4 (the maximum) if your volumes tend to contain a relatively large number of files. However, make sure to test any settings greater than 2, because these greater values cause the system to reserve a much larger portion of the disk for the master file table.
- Use the most efficient disk systems available, including controller, I/O, cabling,
and disk. Use intelligent drivers that support interrupt moderation or interrupt
avoidance to alleviate the interrupt activity for the processor due to disk I/O.
- Check whether you are using the appropriate RAID configuration. Use RAID 10
(striping and mirroring) for best performance and fault tolerance. The tradeoff is
that using RAID 10 is expensive. Avoid using RAID 5 (parity) when you have
extensive write operations.
- Consider using database partitions. If you have a database bottleneck, consider
using database partitions and mapping disks to specific tables and transaction
logs. The primary purpose of partitions is to overcome disk bottlenecks for large
tables. If you have a table with large number of rows and you determine that it is
the source of a bottleneck, consider using partitions. For SQL Server, you can use
file groups to improve I/O performance. You can associate tables with file groups,
and then associate the file groups with a specific hard disk.
- Consider splitting files across hard disks. If you are dealing with extensive
file-related operations, consider splitting the files across a number of hard disks
to spread the I/O load across multiple disks.
at 11:24 PM 0 comments Posted by roni schuetz
Labels: Windows
Memory
Memory consists of physical and virtual memory. You need to consider how much memory is allocated to your application. When you evaluate memory-related bottlenecks, consider unnecessary allocations, inefficient clean up, and inappropriate caching and state management mechanisms. To resolve memory-related bottlenecks, optimize your code to eliminate these issues and then tune the amount of memory allocated to your application. If you determine during tuning that memory contention and excessive paging are occurring, you may need to add more physical memory to the server.
Low memory leads to increased paging where pages of your application’s virtual
address space are written to and from disk. If paging becomes excessive, page
thrashing occurs and intensive disk I/O decreases overall system performance.
Configuration Overview
Memory tuning consists of the following:
- Determine whether your application has a memory bottleneck. If
it does, add more memory.
- Tune the amount of memory allocated if you can control the
allocation. For example, you can tune this for ASP.NET and
SQL Server.
- Tune the page file size.
Metrics
The performance counters help you identify memory bottlenecks. You should log these counter values to log files over a 24-hour period before you form any conclusions (see the logging sketch after the counter list below).
Memory
- Available MBytes
- Page Reads/sec
- Pages/sec
- Cache Bytes
- Cache Faults/sec
Server
- Pool Nonpaged Failures
- Pool Nonpaged Peak
Cache
- MDL Read Hits %
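A minimal sketch for logging the Memory counters above over 24 hours (counter names assume an English-language system; the target .blg path is an example and its folder must already exist):
Get-Counter -Counter '\Memory\Available MBytes', '\Memory\Page Reads/sec', '\Memory\Pages/sec', '\Memory\Cache Bytes', '\Memory\Cache Faults/sec' -SampleInterval 15 -MaxSamples 5760 |
    Export-Counter -Path C:\PerfLogs\memory.blg -FileFormat BLG
# 5760 samples at 15-second intervals = 24 hours; open the .blg in Performance Monitor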
Bottlenecks
A low value of Available MBytes indicates that your system is low on physical
memory, caused either by system memory limitations or an application that is not
releasing memory. Monitor each process object’s working set counter. If a process’s
working set remains high even when the process is not active, it might indicate that
the process is not releasing memory. Use the CLR Profiler tool at this point to identify the source of any memory allocation problems. For more information, see “How To: Use
CLR Profiler” in the “How To” section of this guide.
A high value of Pages/sec indicates that your application does not have sufficient
memory. The average of Pages Input/sec divided by average of Page Reads/sec gives
the number of pages per disk read. This value should not generally exceed five pages
per read. A value greater than five pages per read indicates that the system is
spending too much time paging and requires more memory (assuming that the
application has been optimized).
Tuning Options
If you determine that your application has memory issues, your options include
adding more memory, stopping services that you do not require, and removing
unnecessary protocols and drivers. Tuning considerations include:
- Deciding when to add memory
- Page file optimization
Deciding When to Add Memory
To determine the impact of excessive paging on disk activity, multiply the values of
the Physical Disk\Avg. Disk sec/Transfer and Memory\Pages/sec counters. If the
product of these counters exceeds 0.1, paging is taking more than 10 percent of disk
access time. If this occurs over a long period, you probably need more memory. After
upgrading your system’s memory, measure and monitor again.
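A minimal Get-Counter sketch of that calculation (counter paths assume an English-language system; the sampling interval and count are examples only):
$samples = Get-Counter -Counter '\PhysicalDisk(_Total)\Avg. Disk sec/Transfer', '\Memory\Pages/sec' -SampleInterval 5 -MaxSamples 12
# average each counter over the sampling window
$avgTransfer = ($samples | ForEach-Object { $_.CounterSamples } |
    Where-Object { $_.Path -like '*avg. disk sec/transfer*' } |
    Measure-Object -Property CookedValue -Average).Average
$avgPages = ($samples | ForEach-Object { $_.CounterSamples } |
    Where-Object { $_.Path -like '*pages/sec*' } |
    Measure-Object -Property CookedValue -Average).Average
# a product above 0.1 means paging takes more than 10 percent of disk access time
'Fraction of disk access time spent paging: {0:P1}' -f ($avgTransfer * $avgPages)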
To save memory:
- Turn off services you do not use. Stopping services
that you do not use regularly saves memory and improves
system performance (see the sketch after this list).
- Remove unnecessary protocols and drivers. Even idle
protocols use space in the paged and nonpaged memory
pools. Drivers also consume memory, so you should remove
unnecessary ones.
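A minimal sketch for reviewing and disabling services ('Fax' below is only a hypothetical example; verify dependencies before disabling anything):
# list running services to find candidates you do not need
Get-Service | Where-Object { $_.Status -eq 'Running' } | Sort-Object DisplayName
# stop a service and prevent it from starting again
Stop-Service -Name Fax
Set-Service -Name Fax -StartupType Disabled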
Page File Optimization
You should optimize the page file to improve the virtual memory performance of
your server. The combination of physical memory and the page file is called the
virtual memory of the system. When the system does not have enough physical
memory to execute a process, it uses the page file on disk as an extended memory
source. This approach slows performance.
To ensure an optimized page file:
- Increase the page file size on the system to 1.5 times
the size of physical memory available, but only to
a maximum of 4,095 MB. The page file needs to be at least
the size of the physical memory to allow the memory to be
written to the page file in the event of a system crash.
- Make sure that the page file is not fragmented on a given partition.
- Separate the data files and the page file onto different
disks only if the disk is a bottleneck because of heavy
I/O operations. Otherwise, these files should preferably
be on the same physical drive and the same logical
partition. This keeps the data files and the page file
physically close to each other and avoids the time spent
seeking between two different logical drives.
To configure the page file size
1. Open Control Panel.
2. Double-click the System icon.
3. Select the Advanced tab.
4. Click Performance Options.
5. Click Change. The Virtual Memory dialog box appears.
6. Enter new values for Initial size and Maximum size. Click Set, and then click OK.
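Alternatively, a minimal WMI sketch from PowerShell (the 4,095 MB values and the C:\pagefile.sys location are examples only; changes take effect after a reboot):
# turn off automatic page file management
$cs = Get-WmiObject Win32_ComputerSystem -EnableAllPrivileges
$cs.AutomaticManagedPagefile = $false
$cs.Put() | Out-Null
# set the initial and maximum size explicitly (values in MB)
$pf = Get-WmiObject Win32_PageFileSetting | Where-Object { $_.Name -eq 'C:\pagefile.sys' }
$pf.InitialSize = 4095
$pf.MaximumSize = 4095
$pf.Put() | Out-Null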
at 11:17 PM 0 comments Posted by roni schuetz
Labels: Windows
regions
#region Variable
#endregion
#region Properties
#endregion
#region Methods
#region Constructor
#endregion
#region Public Methods
#endregion
#region Private or Protected Methods
#endregion
#endregion
at 11:02 PM 0 comments Posted by roni schuetz
Windows 7: Indexing network share locations
I tried to index a network share from my filer, but it doesn't appear in my search selections. After some research I figured out that there are two options:
Option 1:
a) right click on the share
b) select the option "Always available offline"
Option 2:
a) Create a folder on your hard drive for shares.
-> i.e. c:\myshares
b) Create another folder inside the above folder:
-> c:\myshares\music
c) Link the Library to this folder.
d) Delete the folder.
e) Use mklink in an elevated command prompt to make a symbolic link.
Name the link the same as the folder you created above.
i.e. mklink /d c:\myshares\music \\myshares\music
f) Done. Now you have a non-indexed UNC path as a library.
enjoy
at 10:27 PM 0 comments Posted by roni schuetz
Labels: Desktop search, Windows 7
Thursday, May 06, 2010
Manage Custom Actions on Sharepoint
Today I searched for over an hour until I found the following 3 links regarding custom actions on SharePoint.
- http://www.customware.net/repository/pages/viewpage.action?pageId=69173255
- http://www.customware.net/repository/pages/viewpage.action?pageId=69173259
- http://www.customware.net/repository/pages/viewpage.action?pageId=69173265
There you will find an example for every custom action possibility, including which GroupId and Location you have to set for a CustomAction.
Thanks to them!
at 3:27 PM 0 comments Posted by roni schuetz
Labels: MOSS, sharepoint 2007
Tuesday, May 04, 2010
zip, zipper and 7zip log files
For various Windows servers I had to decrease the amount of log files on the server itself. Therefore I decided to go with the following strategy:
- leave 3 days of log files as-is on the server
- files which are older than 3 days get zipped, and I keep a copy on the server
- files which are older than 60 days can be deleted because they have already been copied onto additional storage (that copy is a different scenario which I will explain in a separate blog post)
To implement this I used:
- 7-zip
- forfiles.exe (you can download a rar file from my shared cache website)
- Windows Task Scheduler
- one bat file
Multiply the following three forfiles lines as needed for each W3SVC[n] folder and save everything into a *.bat file:
@echo START - %TIME%
REM zip each *.log older than 3 days, then delete the original log
forfiles -p "E:\LogFiles\W3SVC1" -s -m *.log -d -3 -c "cmd /c 7z a @FILE.zip @FILE"
forfiles -p "E:\LogFiles\W3SVC1" -s -m *.log -d -3 -c "cmd /c del @file"
REM delete zip archives older than 60 days
forfiles -p "E:\LogFiles\W3SVC1" -s -m *.zip -d -60 -c "cmd /c del @file"
@echo DONE - %TIME%
I had to compress 24 folders with log files. All files together had a size of 684 MB - quite a lot for log files, but they grow big very fast, so I really suggest implementing this approach.
For my scripting convenience I've added both installation paths to the PATH environment variable:
- Open System properties
- Click on Environment Variables
- Select PATH and then click Edit
- Add the full paths to both 7-Zip and forfiles.exe
Now open your Task Scheduler and create a new task that runs the bat file on your chosen schedule.
Once you are done with the configuration you should make a test run of the task.
The result is that I could save 636 MB on my hard disc on the live server. This space can be used for many other things.