
IT Infrastructure related

Posting IT infrastructure related solutions and information.

Microsoft Windows Server 2008 SP2 Standard Edition – Installed Windows patches fail to show up under the "Installed Updates" list

October 6, 2010

On Windows Server 2008 Standard with SP2, the “Installed Updates” section will sometimes show the updates that were installed, but sometimes multiple Windows patches fail to show up there.

At the same time, if you use MBSA, it detects all of these updates as installed.

So we need to look at two log files to see whether they show any corruption:

%SYSTEMROOT%\Logs\CBS\CheckSUR.log
%SYSTEMROOT%\Logs\CBS\CheckSUR.persist.log
If you see corruption errors in these logs, follow the action plan below.

1) On the server (from which these logs were taken), go to C:\Windows\servicing\Packages and check whether the 12 affected files (6 .mum and 6 .cat files) are present. If they are, take ownership of them, make a copy, and then delete the originals.
2) Once done, find a machine on which these updates are successfully installed. Copy these 12 files (6 .mum and 6 .cat files) from that machine into C:\Windows\servicing\Packages on the problem machine. Then check whether the updates show up in the list of “Installed Updates”. (You may need to reboot the server once after copying the files, then check again.)
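The two steps above can be sketched from an elevated command prompt as follows. The package file names and the \\GOODSERVER source machine are placeholders, not actual names from this case — substitute the packages reported as corrupt in CheckSUR.log:

```bat
rem Sketch of steps 1 and 2; file names below are illustrative placeholders.
cd /d C:\Windows\servicing\Packages

rem Take ownership and grant Administrators full control before deleting
takeown /f Package_for_KB980436~31bf3856ad364e35~x86~~6.0.1.0.mum
icacls Package_for_KB980436~31bf3856ad364e35~x86~~6.0.1.0.mum /grant Administrators:F

rem Keep a backup copy, then remove the corrupt file
copy Package_for_KB980436~31bf3856ad364e35~x86~~6.0.1.0.mum C:\Temp\
del Package_for_KB980436~31bf3856ad364e35~x86~~6.0.1.0.mum

rem Step 2: copy the known-good .mum/.cat pair from a healthy machine
copy \\GOODSERVER\c$\Windows\servicing\Packages\Package_for_KB980436~31bf3856ad364e35~x86~~6.0.1.0.* C:\Windows\servicing\Packages\
```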
If there is no success, check whether Server Manager is able to discover the installed roles and features. If it throws an error message such as HRESULT: 0x800F0818 or HRESULT: 0x800B0100, we need to look at the Server Manager log to see why it is occurring and whether it goes away after a reboot.

So, first we have to run the Microsoft Update Readiness Tool, located here.
Run the tool on the server:

  • Create the following directory structure: C:\Windows\CheckSUR\packages
  • Into this packages directory (created in step 1), download and save the MSU packages for the following updates (they can be downloaded from the Microsoft site):
examples:
        KB2079403
        KB980436
        KB981852
        KB982214
        KB982799
        KB983589 
  • Once all the MSU packages are copied, execute CheckSUR again on the server (running CheckSUR simply means running the Microsoft Update Readiness Tool). Once CheckSUR completes, all installed Windows patches should show up under the “Installed Updates” section.
These steps will resolve the issue of patches not showing up.
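The staging steps above can be sketched like this. The download folder and MSU file names are illustrative assumptions — use the actual packages for the KBs flagged in CheckSUR.log, and the Update Readiness Tool installer you downloaded:

```bat
rem Sketch: stage MSU packages for CheckSUR, then re-run the tool.
mkdir C:\Windows\CheckSUR\packages

rem Copy each downloaded MSU into the packages directory (names are examples)
copy C:\Downloads\Windows6.0-KB980436-x86.msu C:\Windows\CheckSUR\packages\
copy C:\Downloads\Windows6.0-KB981852-x86.msu C:\Windows\CheckSUR\packages\

rem Re-running CheckSUR = re-running the downloaded Update Readiness Tool MSU
C:\Downloads\UpdateReadinessTool.msu /quiet
```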
For Server Manager Error – you can click here for the resolution.

Dynamic Memory for Hyper-V in Windows 2008 Server R2 SP1

September 29, 2010
·  Microsoft’s approach to Dynamic Memory is fundamentally different from VMware’s overcommitment, in that VMware doesn’t trust information about memory usage from within the guest, whereas Microsoft’s implementation is based on an awareness of the amount and type of memory that’s being used at all times.

·  Dynamic Memory works by adding and removing memory.
  • Adding memory is enabled through a new synthetic memory driver.
  • Removing memory that’s not being used is done with ballooning.
  • Memory is now assigned with a few new values:
    • Startup memory is the amount of memory assigned to a VM, which is also the minimum memory the VM will consume (default value is 512 MB).
    • Maximum memory limits how much memory a VM can consume.
    • Priority can be assigned to specific VMs in order to make sure that they receive available memory before other lower-priority VMs.
    • A Memory Buffer can be set to reserve memory for specific VMs, for instance if they need extra memory for file caching. 
·  Hyper-V Manager adds two new columns.
  • Current Memory identifies how much memory the VM is consuming.
  • Memory Availability identifies the difference between how much memory a VM has vs. wants in a +/-% figure.
    • When the availability goes negative, the Windows guest will start to work with the lesser amount of memory that’s now available to it (via paging, etc).
    • Negative availability will result in reduced performance, but the systems will continue to function.
·  Memory is now reserved for the root partition in a different way, so that dynamic memory won’t bring down the host.
  • This amount can be configured with a new registry key based on how the root partition is being used, for instance if it’s your desktop OS.
·  As Dynamic Memory is used more, the chances of spanning NUMA nodes increases (on NUMA systems).
  • Different systems have vastly different back-channel performance, so the impact of NUMA spanning can be negligible or drastic.
  • In SP1, NUMA Spanning can be disabled (if desired). 
·  Dynamic Memory also supports Large Pages, which are likely to become more common with virtualised Exchange/SQL.

  • VMware cannot overcommit these pages.
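The root-partition memory reserve mentioned above can be set with a registry value. A minimal sketch, assuming the key and value name from Microsoft's SP1 Dynamic Memory guidance (MemoryReserve, in MB) — verify against your build before applying:

```bat
rem Sketch: reserve 2 GB of memory for the root partition (value is in MB).
rem Key/value name assumed from Microsoft's SP1 guidance; verify before use.
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Virtualization" /v MemoryReserve /t REG_DWORD /d 2048 /f
```

A reboot of the host is typically needed for the new reserve to take effect.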

Hope you find this info useful; you can post more info here to share the knowledge.

Private Cloud Implementations

September 29, 2010

Today there are two cloud computing service models, the public and the private cloud. The difference between the two:
A public cloud is offered as a service, usually over an Internet connection. Private clouds, on the other hand, are deployed inside the firewall and managed by the user organization.

The private cloud is a new approach to cloud computing, with benefits such as:

  • Easily accessible and reliable utility service.
  • Along with qualities such as self-service, pay-as-you-go charges, on-demand provisioning and scalability.
  • Private clouds eliminate the ‘rewrite everything’ effect of public cloud computing.
  • They offer the trusted control that users seek while ensuring that IT meets the security, control, service levels and compliance requirements of the business. 

Within a Private Cloud, business and application owners can use the infrastructure as a standard service without having to bother about the complexities of servers, storage and networks. Organizations pay only for what they use, and are billed on a subscription (time-based) basis with little or no upfront cost. Private Clouds keep the benefits of cloud computing under the control of enterprise IT while increasing business opportunities for service providers to introduce choice into cloud services.

Virtualization decouples the physical IT infrastructure from the hosted applications and services, thus allowing greater efficiency and flexibility into processes. The virtualization of servers, storage and networks enables the mobility of applications and data—not just across servers and storage arrays in the same data center, but also across data centers and networks. Consolidation is a critical application of virtualization, enabling IT departments to regain control of distributed resources by creating shared pools of standardized resources which can be rationalized and centrally managed.

A few applications like email archiving, document management systems and data backup can be migrated on to the Private Cloud. As these models stabilize and organizations scale up, more business-critical applications can be moved on to the cloud. This helps organizations to deliver internal IT services more effectively in a cost-effective manner.

Remember below before implementing Private Cloud computing.

  • Evaluate current and planned hardware, hypervisors, network architecture and storage.
  • Understand corporate security standards and existing vendor relationships. Know where your vendors are going, so you don’t buy into dead-end technology.
  • Make sure essential content is available in a centralized library.
  • Dynamically manage your IT policies by automating self-service provisioning of applications.
  • Plan your service catalog wisely by creating reusable building blocks of virtual machines and services.
  • Your content is critical; take the time to know your users’ needs and plan for their experience.
  • Take the centralized view that is possible with a private cloud; avoid multiple operating systems.
  • Private Cloud architecture implementation

The fabric of virtualization has its limits. For example, there is little flexibility in scaling a virtual machine: scaling is always limited to the maximum resources available on a particular physical server. For instance, on a 24-core server with 32 GB of RAM, you cannot have a virtual machine with more than (or even equal to) 24 cores and 32 GB of RAM.

However, if there are 50 such servers, then you technically have 1,200 cores and 1.6 TB of RAM. But if an organization needs a machine with higher specs from its virtual infrastructure, say 50 cores and 64 GB of RAM, that is just not possible. The organization has to buy a new server with more than 50 cores and 64 GB of RAM, which can then be used as a physical box or to host a virtual machine.

Server architecture: This technology runs on hypervisor-based virtualization. Servers need Intel VT or AMD Pacifica (AMD-V) features to run as part of the private cloud architecture, so it’s essential that you rule out old servers that lack them.

     
Selection of cloud middleware: It’s best to opt for a private cloud architecture supported by your existing server virtualization platform. A Xen user can select between Ubuntu Enterprise Cloud and Enomaly. If you are a VMware fan, then sticking with a VMware-based private cloud architecture is the better option. Microsoft Hyper-V users will have to wait for the time being: Microsoft has promised to launch its Dynamic Datacenter Toolkit, which will let users build a private cloud architecture based on Hyper-V.
     

This is the info I have as of now; I will update here later with upcoming news.

How to configure scheduled backup jobs to a network shared folder in Windows Server 2008 Standard

September 24, 2010

Windows Server 2008 Standard supports backup to DVD media, an external disk, an attached dedicated backup disk, or a remote shared folder.
We can configure backup in two modes:
1. Manual backup
2. Scheduled backup

In Windows Server 2008 Standard Edition, there is a limitation on storing backups to a network (remote) shared folder: you can run a one-time backup, but not a backup schedule. Scheduling requires an internally attached dedicated disk; storing scheduled backups on a network shared folder is not supported through the MMC (Windows Server Backup console), and you may get the error message below.

See the Backup Guide Here & Here & Here

Cause:

If you have a dedicated disk, you can schedule the backup through the wbadmin enable backup command. In Standard Edition, you cannot do it through the MMC.
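For the dedicated-disk case, a scheduled backup can be sketched as below. The disk GUID and times are placeholders — list your own disks first and use the identifier reported there:

```bat
rem Sketch: find the dedicated disk's identifier (GUID)
wbadmin get disks

rem Schedule a daily 21:00 backup of C: to that disk (GUID is a placeholder)
wbadmin enable backup -addtarget:{aa123456-bb78-90cd-ef12-34567890abcd} -schedule:21:00 -include:C: -quiet
```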

See Here

This is due to a limitation in the Standard Edition of Windows Server 2008; this enhanced capability comes with the R2 editions.

In Windows Server 2008 R2 you can store both one-time (ad hoc) backups and scheduled backups on remote shared folders; the capability to store scheduled backups on remote shared folders is new in R2. You can save scheduled backups to one or more attached disks (either internal or external), to a volume, or, new in Windows Server 2008 R2, to a remote shared folder.

See Here for details.

Solution:

You can achieve the requirement by configuring scheduled backups to a remote shared folder from the command prompt, using the WBADMIN command.

1. Manual backup:

Use the command below in a batch file and run a one-time backup to the remote shared folder:

wbadmin start backup -backupTarget:\\[Server]\[Share name] -include:C: -vssFull -quiet
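To confirm the one-time backup actually landed on the share, you can list the backup versions stored there (same [Server]/[Share name] placeholders as above):

```bat
rem Sketch: list backups stored on the share to verify the job succeeded
wbadmin get versions -backupTarget:\\[Server]\[Share name]
```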

2. Automatic scheduled backup to a remote shared folder:

Use the same command in a batch file. For the complete steps, follow below.

a. Create a batch file with this command.
b. Change the -backupTarget parameter as appropriate (\\server\sharename).
c. Copy this batch file to the root of the drive.
d. Log in to the server with an administrator account.
e. Check and ensure that the VSS, Block Level Backup Engine and Windows Backup services are running and set to Automatic.
f. Open the remote shared folder and check the permissions granted. Ensure that domain\administrator has full access; executing this task requires membership of the Administrators or Backup Operators group.
g. Check the Windows Firewall status. It should be OFF between the source and destination servers; if your company policy requires it to stay ON, ensure that Windows Backup is included in the allowed list in Windows Firewall.
h. Open Task Scheduler (Run as administrator).
i. Select ‘Create a Task’.
j. Give the task a name and schedule it with the date/time needed (allow at least a 10-minute delay so you can finish this configuration first).
k. Switch to the Actions tab; under ‘Start a program’, select the batch file from the root of the drive.
l. Modify other settings as needed.
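The same task the UI steps create can be sketched from the command line with schtasks. The task name, batch-file path, account and start time are placeholders; /rl HIGHEST corresponds to the "Run with highest privileges" option and /rp * prompts for the password:

```bat
rem Sketch: create the daily backup task from the command line
schtasks /create /tn "Daily Backup" /tr C:\backup.bat /sc daily /st 23:00 /ru DOMAIN\Administrator /rp * /rl HIGHEST
```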

Once done, do not start the task from the wizard yet, or it will throw an error code of 4294967294:

“Task Scheduler successfully completed task “\Daily Backup” , instance “{38ff800e-a688-40a0-bc79-d0519f662fc9}” , action “C:\Windows\System32\wbadmin.exe” with return code 4294967294″

This error code basically indicates an issue with permissions. You need to ensure two things, as specified below, for the resolution.

Once you have configured the scheduled task, make sure its owner is the Domain\Administrator account, and that you have selected the option “Run with highest privileges”.

I hope it is very useful. Have a good time ahead…..!!!
*****************************************************************************************************