Tuesday, March 31, 2015

Microsoft applies 'freemium' tactic to mobile device management for Office 365

Fulfills 2014 pledge to bundle basic MDM tools with all commercial Office 365 subscriptions

Microsoft today made good on a promise from last fall, adding several basic mobile device management (MDM) tools to all commercial Office 365 subscriptions.

"With MDM for Office 365, you can manage access to Office 365 data across a diverse range of phones and tablets, including iOS, Android and Windows Phone devices," said Shobhit Sahay, a technical product manager with the Office 365 group, in a blog post Monday. "The built-in MDM features are included at no additional cost in all Office 365 commercial plans, including Business, Enterprise, EDU and Government plans."

Sahay's announcement fulfilled the pledge Microsoft made in October 2014, when the company said an MDM-specific upgrade would be released in the first quarter of 2015.

The free-of-charge tools now available allow Office 365 administrators to limit access to Office 365 corporate email and documents to company-managed devices; set device-level PIN locking; and wipe Office 365-related data from an employee's device, such as when they leave the organization and take their personal device with them.

Sahay steered enterprises that require additional features toward Microsoft Intune, a subset of the even more comprehensive Enterprise Mobility Suite. Intune adds support for Windows-powered PCs; covers other mobile apps, including line-of-business apps developed in-house, not just Office 365; and allows administrators to provision devices with additional security configurations like VPN.

An outline of the feature differences between the free MDM for Office 365 and Intune can be found on Microsoft's website. Intune costs $6 per user per month; Enterprise Mobility Suite runs $7.60 per user per month.

Microsoft's some-free-some-not approach to MDM meshes nicely with the broader "freemium" strategy that company executives have talked up recently, said Wes Miller, an analyst at Directions on Microsoft. "If your life revolves around Office 365, this is most definitely a freemium play," said Miller. "But this works only with Office 365 apps."

To assemble a comprehensive Microsoft-made MDM solution, then, organizations that have non-Office 365 apps on their employees' devices -- whether homegrown or purchased from other developers -- or that want to manage PCs at the same time will need to pony up for Intune.

Miller said Microsoft's goal is two-fold: First, to answer customers' requests for a way to manage the explosion of mobile apps that the firm has released in the last year for Android and iOS, and second, to give away "a taste of [MDM]" as a way to upsell enterprises on Intune.

"If you want to integrate [Office 365] with the rest of an infrastructure, you're also going to need to go to Intune," said Miller. "Once you buy into the whole Microsoft story [of things like OneDrive for Business, Azure App Service and Azure Active Directory], Intune starts to be more appealing."

Although the new MDM tools for Office 365 began rolling out today, it will be approximately four to six weeks before they reach all customers, said Sahay.

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Tuesday, March 24, 2015

7 timeless lessons of programming ‘graybeards’

Heed the wisdom of your programming elders, or suffer the consequences of fundamentally flawed code

In episode 1.06 of the HBO series "Silicon Valley," Richard, the founder of a startup, gets into a bind and turns for help to a boy who looks 13 or 14.

The boy genius takes one look at Richard and says, “I thought you’d be younger. What are you, 25?”

“26,” Richard replies.

“Yikes.”

The software industry venerates the young. If you have a family, you're too old to code. If you're pushing 30 or even 25, you're already over the hill.

Alas, the whippersnappers aren't always the best solution. While their brains are full of details about the latest, trendiest architectures, frameworks, and stacks, they lack fundamental experience with how software really works and doesn't. That experience comes only after many lost weeks of frustration born of weird and inexplicable bugs.

Like the viewers of “Silicon Valley,” who by the end of episode 1.06 get the satisfaction of watching the boy genius crash and burn, many of us programming graybeards enjoy a wee bit of schadenfreude when those who have ignored us for being “past our prime” end up with a flaming pile of code simply because they didn’t listen to their programming elders.

In the spirit of sharing or to simply wag a wise finger at the young folks once again, here are several lessons that can't be learned by jumping on the latest hype train for a few weeks. They are known only to geezers who need two hexadecimal digits to write their age.
Memory matters

It wasn't so long ago that computer RAM was measured in megabytes, not gigabytes. When I built my first computer (a Sol-20), it was measured in kilobytes. There were about 64 RAM chips on that board and each had about 18 pins. I don't recall the exact number, but I remember soldering every last one of them myself. When I messed up, I had to resolder until the memory test passed.

When you jump through hoops like that for RAM, you learn to treat it like gold. Kids today allocate RAM left and right. They leave pointers dangling and don't clean up their data structures because memory seems cheap. They know they click on a button and the hypervisor adds another 16GB to the cloud instance. Why should anyone programming today care about RAM when Amazon will rent you an instance with 244GB?

But there's always a limit to what the garbage collector will do, exactly as there's a limit to how many times a parent will clean up your room. You can allocate a big heap, but eventually you need to clean up the memory. If you're wasteful and run through RAM like tissues in flu season, the garbage collector could seize up grinding through that 244GB.

Then there's the danger of virtual memory. Your software will run 100 to 1,000 times slower if the computer runs out of RAM and starts swapping out to disk. Virtual memory is great in theory, but slower than sludge in practice. Programmers today need to recognize that RAM is still precious. If they don't, the software that runs quickly during development will slow to a crawl when the crowds show up. Your work simply won't scale. These days, everything is about being able to scale. Manage your memory before your software or service falls apart.
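The garbage collector's limit described above can be seen in a few lines: a collector can only reclaim what nothing references, so "cheap" allocations that stay reachable pile up forever. A minimal Python sketch (the `cache` structure and `Record` class are invented for illustration):

```python
import gc
import weakref

class Record:
    """Stand-in for a chunk of allocated data."""
    def __init__(self, payload):
        self.payload = payload

cache = []                # a long-lived structure that keeps references alive
r = Record("x" * 1024)
probe = weakref.ref(r)    # lets us observe whether the object was collected
cache.append(r)

del r
gc.collect()
# Still reachable through `cache`, so the collector cannot reclaim it.
assert probe() is not None

cache.clear()             # drop the last reference...
gc.collect()
# ...and only now can the memory actually be reclaimed.
assert probe() is None
```

The collector is not magic: until the program lets go of the data, no amount of heap will save it.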

Computer networks are slow

The marketing folks selling the cloud like to pretend the cloud is a kind of computing heaven where angels move data with a blink. If you want to store your data, they're ready to sell you a simple Web service that will provide permanent, backed-up storage you'll never need to worry about.

They may be right in that you might not need to worry about it, but you'll certainly need to wait for it. All traffic in and out of computers takes time. Computer networks are drastically slower than the traffic between the CPU and the local disk drive.

Programming graybeards grew up in a time when the Internet didn't exist. FidoNet would route your message by dialing up another computer that might be closer to the destination. Your data would take days to make its way across the country, squawking and whistling through modems along the way. That painful experience taught them to perform as much computation as possible locally and to write to a distant Web service only when the result is as small and final as it can be. Today's programmers can take a tip from those hard-earned lessons: treat the promises of cloud storage with suspicion, and put off the round trip until the last possible millisecond.
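The compute-locally, write-late rule can be sketched with a toy store that counts round trips; `RemoteStore` is an invented stand-in for a slow cloud storage API, not a real library:

```python
class RemoteStore:
    """Invented stand-in for a remote storage service; counts round trips."""
    def __init__(self):
        self.round_trips = 0
        self.data = []

    def put(self, records):
        self.round_trips += 1      # each call pays network latency once
        self.data.extend(records)

def save_naive(store, records):
    for r in records:
        store.put([r])             # one round trip per record

def save_batched(store, records):
    processed = [r.upper() for r in records]  # do the work locally first
    store.put(processed)           # one round trip for the final result

naive, batched = RemoteStore(), RemoteStore()
save_naive(naive, ["a", "b", "c"])
save_batched(batched, ["a", "b", "c"])
assert naive.round_trips == 3      # pays latency n times
assert batched.round_trips == 1    # pays it once
```

The latency per `put` is what dominates on a real network, so cutting the call count usually matters far more than shaving the payload.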
Compilers have bugs

When things go haywire, the problem more often than not resides in our code. We forgot to initialize something, or we forgot to check for a null pointer. Whatever the specific reason, every programmer knows, when our software falls over, it’s our own dumb mistake -- period.

As it turns out, the most maddening errors aren’t always our fault. Sometimes the blame lies squarely on the compiler or the interpreter. While compilers and interpreters are relatively stable, they're not perfect, and their stability has been hard-earned. Unfortunately, taking it for granted has become the norm.

It's important to remember that they, too, can be wrong, and to consider that possibility when debugging. Old programmers learned long ago that sometimes the best route to fixing a bug involves testing not our code but our tools. If you put implicit trust in the compiler and give no thought to the computations it performs on your code, you can spend days or weeks pulling out your hair in search of a bug in your work that doesn’t exist. The young kids, alas, will learn this soon enough.
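One graybeard habit for testing the tools rather than the code is to compute the same answer through two independent paths; if the paths ever disagree, suspicion widens to the toolchain. A minimal sketch of that habit (the summation is just a convenient example):

```python
def sum_loop(n):
    """Sum 1..n the obvious way, exercising the loop machinery."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    """The same quantity via Gauss's closed form, a fully independent path."""
    return n * (n + 1) // 2

# If these ever diverge, the bug hunt must include the compiler,
# interpreter, or optimizer -- not just the application code.
for n in (0, 1, 10, 1000):
    assert sum_loop(n) == sum_formula(n)
```

Cross-checks like this cost little and turn "the tool can't be wrong" from an assumption into something actually verified.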

Speed matters to users

Long ago, I heard that IBM did a study on usability and found that people's minds will start to wander after 100 milliseconds. Is it true? I asked a search engine, but the Internet hung and I forgot to try again.

Anyone who ever used IBM's old green-screen apps hooked up to an IBM mainframe knows that IBM built its machines as if this 100-millisecond mind-wandering threshold were a fact hard-wired into our brains. They fretted over the I/O circuitry. When they sold the mainframes, they issued spec sheets that counted how many I/O channels were in the box, in the same way car manufacturers count the cylinders in their engines. Sure, the machines crashed, exactly like modern ones, but when they ran smoothly, the data flew out of these channels directly to the users.

I have witnessed at least one programming whippersnapper defend a new AJAX-heavy project that was bogged down by too many JavaScript libraries and data flowing to the browser. It's not fair, they often retort, to compare their slow-as-sludge innovations with the old green-screen terminals that they have replaced. The rest of the company should stop complaining. After all, we have better graphics and more colors in our apps. It’s true -- the cool, CSS-enabled everything looks great, but users hate it because it’s slow.
The real Web is never as fast as the office network

Modern websites can be time pigs. It can often take several seconds for the megabytes of JavaScript libraries to arrive. Then the browser has to push these multilayered megabytes through a JIT compiler. If we could add up all of the time the world spends recompiling jQuery, it could be thousands or even millions of years.

This is an easy mistake to make for programmers who are in love with browser-based tools that employ AJAX everywhere. It all looks great in the demo at the office. After all, the server is usually on the desk back in the cubicle. Sometimes the "server" is running on localhost. Of course, the files arrive with the snap of a finger and everything looks great, even when the boss tests it from the corner office.

But the users on a DSL line or at the end of a cellular connection routed through an overloaded tower? They're still waiting for the libraries to arrive. When the code doesn't arrive in a few milliseconds, they're off to some article on TMZ.
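The gap between the office demo and a real connection is easy to quantify with back-of-the-envelope arithmetic. A sketch, where the link speeds are illustrative assumptions and latency, loss, and protocol overhead are ignored:

```python
def transfer_seconds(payload_megabytes, link_megabits_per_sec):
    """Ideal transfer time: bytes -> bits, divided by link speed."""
    return payload_megabytes * 8 / link_megabits_per_sec

payload_mb = 2  # a couple of megabytes of JavaScript libraries

office_lan = transfer_seconds(payload_mb, 1000)  # gigabit office network
dsl = transfer_seconds(payload_mb, 1.5)          # a slow DSL line

assert office_lan < 0.05  # effectively instant in the demo
assert dsl > 10           # double-digit seconds for the DSL user
```

The same payload that snaps into place on the office LAN keeps the DSL user staring at a spinner for ten-plus seconds, well past any attention threshold.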

Algorithmic complexity matters

On one project, I ran into a bind exactly like Richard's in "Silicon Valley," and I turned for help to someone below the drinking age who knew Greasemonkey backward and forward. He rewrote our code and sent it back. After reading through the changes, I realized he had made it look more elegant, but the algorithmic complexity went from O(n) to O(n^2). He was sticking data in a list in order to match things. It looked pretty, but it would get very slow as n got large.

Algorithm complexity is one thing that college courses in computer science do well. Alas, many high school kids haven't picked this up while teaching themselves Ruby or CoffeeScript in a weekend. Complexity analysis may seem abstruse and theoretical, but it can make a big difference as projects scale. Everything looks great when n is small. Exactly as code can run quickly when there's enough memory, bad algorithms can look zippy in testing. But when the users multiply, it's a nightmare to wait on an algorithm that takes O(n^2) or, even worse, O(n^3).

When I asked our boy genius whether he meant to turn the matching process into a quadratic algorithm, he scratched his head. He wasn't sure what we were talking about. After we replaced his list with a hash table, all was well again. He's probably old enough to understand by now.
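The fix described above, swapping a list scan for a hash-table lookup, looks something like this sketch (the data and names are invented for illustration):

```python
def match_quadratic(keys, records):
    """O(n^2): for each record, scan the whole key list."""
    return [r for r in records if r["id"] in keys]      # `in` on a list is O(n)

def match_linear(keys, records):
    """O(n): one hash-table lookup per record."""
    key_set = set(keys)                                 # build the hash table once
    return [r for r in records if r["id"] in key_set]   # `in` on a set is O(1)

keys = [1, 3, 5]
records = [{"id": i} for i in range(6)]

# Same answer either way; only the growth rate differs as n gets large.
assert match_quadratic(keys, records) == match_linear(keys, records)
```

Both versions are three lines and look equally tidy, which is exactly why the quadratic one survives code review until the users multiply.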
Libraries can suck

The people who write libraries don't always have your best interest at heart. They're trying to help, but they're often building something for the world, not your pesky little problem. They often end up building a Swiss Army knife that can handle many different versions of the problem, not something optimized for your issue. That's good engineering and great coding, but it can be slow.

If you're not paying attention, libraries can drag your code into a slow swamp and you won't even know it. I once had a young programmer mock my code because I wrote 10 lines to pick characters out of a string.

"I can do that with a regular expression and one line of code," he boasted. "Ten-to-one improvement." He didn't consider the way that his one line of code would parse and reparse that regular expression every single time it was called. He simply thought he was writing one line of code and I was writing 10.
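In Python, for instance, the one-liner's hidden cost can be avoided by hoisting the compilation out of the hot path. The pattern below is an invented example; note that Python's `re` module does cache recently used patterns internally, but the cache lookup is still paid on every call, and many regex libraries in other languages cache nothing at all:

```python
import re

def digits_naive(s):
    # The pattern string is re-resolved (at best, fetched from re's
    # internal cache) on every single call.
    return re.findall(r"\d+", s)

DIGITS = re.compile(r"\d+")   # compiled once, outside the hot path

def digits_hoisted(s):
    # The inner loop now pays only for the match, not the compile.
    return DIGITS.findall(s)

lines = ["order 12 of 99", "no numbers here", "7 items"]
assert [digits_naive(l) for l in lines] == [digits_hoisted(l) for l in lines]
```

The two versions are equivalent in output; the difference only shows up when the call sits inside a loop that runs millions of times.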

Libraries and APIs can be great when used appropriately. But if they're used in the inner loops, they can have a devastating effect on speed and you won't know why.



Saturday, March 21, 2015

70-461 Querying Microsoft SQL Server 2012

QUESTION 1
You use Microsoft SQL Server 2012 to develop a database application. You create a table by using
the following definition:
CREATE TABLE Prices (
PriceId int IDENTITY(1,1) PRIMARY KEY,
ActualPrice NUMERIC(16,9),
PredictedPrice NUMERIC(16,9)
)
You need to create a computed column based on a user-defined function named udf_price_index.
You also need to ensure that the column supports an index. Which three Transact-SQL statements
should you use? (To answer, move the appropriate SQL statements from the list of statements to
the answer area and arrange them in the correct order.)
Build List and Reorder:

Answer:

QUESTION 2
You use Microsoft SQL Server 2012 to develop a database that has two tables named Div1Cust and
Div2Cust. Each table has columns named DivisionID and CustomerId. None of the rows in Div1Cust
exist in Div2Cust. You need to write a query that meets the following requirements:
* The rows in Div1Cust must be combined with the rows in Div2Cust.
* The result set must have columns named Division and Customer.
* Duplicates must be retained.
Which three Transact-SQL statements should you use? (To answer, move the appropriate
statements from the list of statements to the answer area and arrange them in the correct order.)
Build List and Reorder:

Answer:

QUESTION 3
You administer a Microsoft SQL Server 2012 database that contains a table named OrderDetail. You
discover that the NCI_OrderDetail_CustomerID non-clustered index is fragmented. You need to
reduce fragmentation. You need to achieve this goal without taking the index offline. Which
Transact-SQL batch should you use?

A. CREATE INDEX NCI_OrderDetail_CustomerID ON OrderDetail (CustomerID) WITH (DROP_EXISTING = ON)
B. ALTER INDEX NCI_OrderDetail_CustomerID ON OrderDetail REORGANIZE
C. ALTER INDEX ALL ON OrderDetail REBUILD
D. ALTER INDEX NCI_OrderDetail_CustomerID ON OrderDetail REBUILD

Answer: B


QUESTION 4
You develop a Microsoft SQL Server 2012 database. The database is used by two web applications
that access a table named Products. You want to create an object that will prevent the applications
from accessing the table directly while still providing access to the required data. You need to
ensure that the following requirements are met:
* Future modifications to the table definition will not affect the applications' ability to access
data.
* The new object can accommodate data retrieval and data modification.
* You need to achieve this goal by using the minimum amount of changes to the existing
applications.
What should you create for each application?

A. views
B. table partitions
C. table-valued functions
D. stored procedures

Answer: A


QUESTION 5
You develop a Microsoft SQL Server 2012 database. You need to create a batch process that meets
the following requirements:
* Returns a result set based on supplied parameters.
* Enables the returned result set to perform a join with a table.
Which object should you use?

A. Inline user-defined function
B. Stored procedure
C. Table-valued user-defined function
D. Scalar user-defined function

Answer: C

Monday, March 9, 2015

70-481 Essentials of Developing Windows Metro style Apps using HTML5 and JavaScript


QUESTION 1
You are preparing to write code that configures a CredentialPicker object. The code should allow
for platinum members to save their user credentials according to business authentication
prerequisites.
Which of the following is the property that should be included in your code?

A. The PreviousCredential property.
B. The AuthenticationProtocol property.
C. The CredentialSaveOption property.
D. The TargetName property.

Answer: C

Explanation:


QUESTION 2
You are preparing to write code that enforces the technical search capabilities requirements.
Which of the following is a method that should be included in your code?

A. The appendSearchSeparator method.
B. The appendResultSuggestion method.
C. The appendQuerySuggestions(suggestions) method.
D. The appendQuerySuggestion(text) method.

Answer: C

Explanation:


QUESTION 3
You have been instructed to make sure that customers and visitors are shown in keeping with the
prerequisites. You are preparing to write the necessary code.
Which of the following should be included in your code?

A. The CommitButtonText property of the ContactPicker class.
B. The SelectionMode property of the ContactPicker class.
C. The Email property of the ContactPicker class.
D. The DesiredFields property of the ContactPicker class.

Answer: D

Explanation:


QUESTION 4
You are preparing to write code to deal with adding and saving annotations according to the
technical product news updates prerequisites.
Which of the following should be included in the code?

A. You should consider making use of the onbeforenavigate navigation member.
B. You should consider making use of the onnavigated navigation member.
C. You should consider making use of the canGoForward navigation member.
D. You should consider making use of the canGoBack navigation member.

Answer: A

Explanation:


QUESTION 5
You are preparing to write code to satisfy the navigation business requirements.
Which of the following is the function that should be included in your code?

A. The navigate function.
B. The forward function.
C. The back function.
D. The addEventListener function.

Answer: A

Explanation: