Can anyone help me with an answer, please?
I have my normal housekeeping full backup and transaction log backup running under an Agent job.
Now I have a requirement to put one database under log shipping. So my question:
1)Does my existi
It’s fair to say that there has never been a bigger driver of network evolution than the cloud. The reason is that the cloud is a fundamentally different kind of compute paradigm: it enables changes to applications, data, and architecture to be made almost instantly. Cloud-native infrastructure is what enables mobile app developers to roll out new versions daily if they so choose.
The cloud is network-centric
Another fact about the cloud is that it is a network-centric compute model, so a poorly performing network leads to equally poorly performing applications. A lack of network agility means DevOps teams need to sit around twiddling their thumbs while network operations makes changes to the network.
A step-by-step guide to creating a practice setup for the 70-462 exam using Azure virtual machines.
How to fix SQL Server disk I/O bottlenecks (without a hammer): in this new article, Simple-Talk editor Tony Davis explains step by step how to find and fix the root causes of disk I/O bottlenecks, including gathering data, avoiding knee-jerk fixes, and how monitoring tools can help.
At its GPU Technology Conference this week, Nvidia took the wraps off a new DGX-2 system it claims is the first to offer multi-petaflop performance in a single server, thus greatly reducing the footprint to get to true high-performance computing (HPC).
DGX-2 comes just seven months after the DGX-1 was introduced, although it won’t ship until the third quarter. However, Nvidia claims it has 10 times the compute power of the previous generation, thanks to twice the number of GPUs, much more memory per GPU, faster memory, and a faster GPU interconnect.
The DGX-2 uses the Tesla V100 GPU, the top of the line for Nvidia’s HPC and artificial intelligence cards. With the DGX-2, Nvidia has doubled the on-board memory to 32GB. Nvidia claims the DGX-2 is the world’s first single physical server with enough computing power to deliver two petaflops, a level of performance usually delivered by hundreds of servers networked into clusters.
A short tutorial on scripting objects in SSMS.
Learn how you can move data from your Data Lake to Azure SQL Database with Azure Data Factory.
The growth in cloud computing has shone a spotlight on data centers, which already consume at least 7 percent of the global electricity supply, a share that is still growing, according to some estimates. This has led the IT industry to search for ways of making infrastructure more efficient, including some efforts that attempt to rethink the way computers and data centers are built in the first place.
SQL Server 2016 introduced automatic seeding for availability groups. This article details the steps to use SSMS with automatic seeding, along with its limitations.
[Note: The author of this article is not a lawyer and this article should not be considered legal advice. Please consult a privacy specialist.]
The basic news
The GDPR covers all personal data your company stores on data subjects in the EU – whether or not your company has nexus in the EU. Personal data is defined as data that can be used to identify a person. It’s similar to the concept of personally identifiable information (PII) that we have in the US, but it is broader. PII typically includes actual identifying elements like your name, Social Security number, and birthday, focusing mainly on the data required to fake your identity with a lender. Personal data includes what the US calls PII, plus any data that can be used to identify you in any way, which includes things as basic as an email address, an online personality (e.g. a Twitter handle), or even the IP address from which you transmitted a message.
Microsoft has released a version of SQL Server 2017 for Linux.
IBM and Hewlett Packard Enterprise this week introduced new servers optimized for artificial intelligence, and the two had one thing in common: Nvidia technology.
HPE this week announced Gen10 of its HPE Apollo 6500 platform, running Intel Skylake processors and up to eight Pascal or Volta Nvidia GPUs connected by NVLink, Nvidia’s high-speed interconnect.
A fully loaded V100 server will get you 66 peak double-precision teraflops of performance, which HPE says is three times the performance of the previous generation.
The Apollo 6500 Gen10 platform is aimed at deep-learning workloads and traditional HPC use cases. The NVLink technology is up to 10 times faster than PCI Express Gen 3 interconnects.
There are times when you need to copy and paste something out of SSMS’s grid-view results pane that contains a carriage return. Pasting that data into Excel can be a headache and cause you to waste precious time reformatting. This post offers a tidbit of PowerShell code to help.
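The article's tidbit is in PowerShell; as a minimal sketch of the same idea in Python (the function name and sample data are my own), the trick is simply to replace embedded CR/LF characters inside each field before the data reaches Excel, so every result row stays on one line:

```python
def flatten_fields(rows, replacement=" "):
    # Replace embedded carriage returns / line feeds inside every field so
    # that each result row occupies exactly one line when pasted into Excel.
    return [
        [str(field).replace("\r\n", replacement)
                   .replace("\r", replacement)
                   .replace("\n", replacement)
         for field in row]
        for row in rows
    ]

# Hypothetical grid rows, one field containing an embedded CRLF:
rows = [("widget", "line one\r\nline two"), ("gadget", "single line")]
print(flatten_fields(rows))  # [['widget', 'line one line two'], ['gadget', 'single line']]
```

The order of replacements matters: handling `\r\n` first avoids turning one Windows line break into two separator characters.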
Because Microsoft has shifted to a more gradual upgrade of Windows Server, many of the features that will become available with Windows Server 2019 have already been in use in live corporate networks, and here are half a dozen of the best.
Enterprise-grade hyperconverged infrastructure (HCI)
With the release of Windows Server 2019, Microsoft rolls up three years of updates for its HCI platform. That’s because the gradual upgrade schedule Microsoft now uses includes what it calls Semi-Annual Channel releases – incremental upgrades as they become available. Then every couple of years it creates a major release, called the Long-Term Servicing Channel (LTSC) version, that includes the upgrades from the preceding Semi-Annual Channel releases.
This article shows how DefaultBufferMaxRows and DefaultBufferSize properties can be used to improve dataflow task performance.
The query optimizer now treats EXISTS and IN the same way, whenever it can, so you’re unlikely to see any significant performance differences. Nevertheless, you need to be cautious when using the NOT IN operator if the subquery’s source data contains NULL values. If so, you should consider using a NOT EXISTS operator instead of NOT IN, or recast the statement as a left outer join.
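The NULL pitfall described above follows from standard SQL three-valued logic, so it can be demonstrated with any SQL engine. Here is a small sketch using Python's built-in SQLite (the table and column names are hypothetical, invented for the example): one NULL in the subquery makes every `NOT IN` comparison evaluate to UNKNOWN, so the query returns no rows at all, while `NOT EXISTS` behaves as expected.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders  (customer_id INTEGER);
    CREATE TABLE blocked (customer_id INTEGER);
    INSERT INTO orders  VALUES (1), (2), (3);
    INSERT INTO blocked VALUES (2), (NULL);   -- note the NULL
""")

# NOT IN: customer_id <> NULL is UNKNOWN, so the whole predicate is never
# TRUE for any row -- the result set is empty.
not_in = con.execute(
    "SELECT customer_id FROM orders "
    "WHERE customer_id NOT IN (SELECT customer_id FROM blocked)"
).fetchall()

# NOT EXISTS: the NULL row simply never matches, so the expected rows return.
not_exists = con.execute(
    "SELECT customer_id FROM orders o "
    "WHERE NOT EXISTS (SELECT 1 FROM blocked b "
    "                  WHERE b.customer_id = o.customer_id)"
).fetchall()

print(not_in)      # []
print(not_exists)  # [(1,), (3,)]
```

Deleting the NULL row from `blocked` makes both queries return the same result, which is exactly why the bug is easy to miss in testing.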
I recently published a new blog post on SQLBI.
You can read it at this link: https://www.sqlbi.com/articles/capturing-power-bi-queries-using-dax-studio/.
Microsoft is set to make Windows Server 2019 generally available in the second half of the year, opening up access to its preview build through its Insiders program now and targeting data centers with new features to handle hybrid cloud setups and hyperconverged infrastructure.
The next version of Windows Server also adds new security features and enhances support for containers and Linux.
If you want to check out the release for yourself, sign up for the Insiders program.
As data centers are called upon to handle an explosion of unstructured data fed into a variety of cutting-edge applications, the future for FPGAs looks bright.
That’s because FPGAs, or field programmable gate arrays, are essentially chips that can be programmed, after manufacturing, to act as custom accelerators for workloads including machine-learning, complex data analysis, video encoding, and genomics – applications that have far-reaching consequences for communications, networking, health care, the entertainment industry and many other businesses.
Such applications lend themselves to parallel processing, an important feature of FPGAs, which can also be reconfigured on the fly to handle new features as the nature of these workloads evolves.
Last week Microsoft released the public preview of Azure SQL Database Managed Instances – an exciting new option for running SQL Server workloads in the cloud. This blog post explains what they are, and how Redgate's SQL Toolbelt supports them.
When CEOs consider the benefits of DevOps and whether or not they should introduce it, their viewpoint will be influenced by their ongoing concerns with lowering costs and increasing revenues. As a result, factors like gaining a faster time to market and creating higher quality products will be top of the agenda. It’s a different story for CIOs because their focus is more on processes that can increase the throughput of the IT department, or how skilled IT staff can be recruited and retained.
What would your organization do if your cloud provider were to go out of business? What happens if your cloud provider suddenly stops offering critical services that your organization requires for its business to function properly? Businesses need to start asking these important questions and develop plans to address these scenarios.
The cloud is a new market that continues to grow, and more small players are offering their services. According to Gartner, cloud system infrastructure services (IaaS) revenue is expected to grow from $45.8 billion in 2018 to $72.4 billion in 2020. As the market matures, it's only natural that some of these organizations will disappear or stop offering certain services. In 2013, Nirvanix stopped offering its cloud services and gave customers only two weeks’ notice to move their data off its platform.
Several things make bare-metal cloud providers appealing compared with traditional cloud providers, which operate in a virtualized environment. Bare-metal providers give users more control, more access to hardware, more performance, and the ability to pick their own operating environment.
There's another interesting angle, as articulated by Martin Blythe, a research fellow with Gartner. He maintains that bare-metal providers appeal to small and mid-sized businesses (SMBs) because the providers are often small, local players, and SMBs looking for something more economical than hosting their own data center often want to keep that data center nearby.
Need to inspect your Data Lake objects and files? Want to automate the process? Find out how with PowerShell.
SQL Provision launched in January, offering users blazingly fast database copying, with a light storage footprint, centralized management, and the ability to mask any sensitive data, prior to distribution. This new release takes compliant provisioning one step further, by integrating data masking directly into SQL Clone’s image creation process, rather than running it as a separate step prior to the image creation.
The company I work for, Perion, chose Amazon Web Services as its main cloud provider for managing and operating our applications. These days I am learning how to manage databases and data-related flows in the AWS cloud.
That is why I attended the AWS Summit in Tel Aviv today.
Today’s AWS Summit was the first in the series of AWS Summits held in major cities around the world. There were roughly 5,000 attendees, 25 technical sessions, and 33 AWS partner sponsors. Most of the sessions I attended mixed a lot of sponsor content with not-too-deep technical dives into various AWS services.
Sessions were divided into five tracks, led by AWS Solution Architects and sponsor representatives:
I really enjoyed the keynote by Dr. Werner Vogels, Amazon's CTO. The keynote hall was already fully packed when I arrived, so I had to watch the keynote video stream from the second hall. The sound was awesome, but the person operating the camera wasn't good at his job: half of the slides could not be seen, because the focus stayed on the speaker the whole time.
Dr. Werner talked about Amazon's revenue run rate, which is near $20 billion (this forbes.com article says $16 billion; Microsoft has $16.7 billion, IBM $15.8 billion). AWS has millions of active customers, and its pace of innovation (the number of unique services released per year) keeps growing very quickly:
For seven consecutive years, Amazon has been named a Leader in Gartner’s Infrastructure as a Service (IaaS) Magic Quadrant.
Dr. Werner talked about how infrastructure and software vendors used to tell us how to develop software. These days, in the cloud, we have as many tools as we need to tailor our development to our own needs.
Modern applications are continuously evolving. They are more than web content: they use microservices architectures and are API-based. Everything is programmable; many have smart connectivity, push notifications, and responsive design. We used to do one deployment a week; now we do 50 deployments a day.
There were many awesome examples of AWS customers during the keynote. One of them was the cloud infrastructure behind iRobot:
(Image source : https://aws.amazon.com/solutions/case-studies/irobot/)
Dr. Werner mentioned how Netflix uses Machine Learning to build smarter applications: 75% of Netflix videos are viewed after a recommendation made by the recommendation engine. He also mentioned Pinterest, one of the world’s largest visual bookmarking tools, where all images go through a Machine Learning process to get a better grip on visual search. Other cool examples were boxing gloves that measure punch speed and intensity, a helmet that analyzes the pilot’s emotions, and soccer and baseball video ingestion that uses Machine Learning to help coaches understand how the team can do better in the next game.
Machine Learning can be used in many more use-cases:
Dr. Werner talked about modern data architectures and their 10 main characteristics: reliable data ingestion, preservation of source data, lifecycle management, metadata capture, security, self-service discovery, data quality, analytics, orchestration, and capturing data changes. He mentioned that more than 60,000 databases have already been migrated into the AWS cloud from private datacenters. He also talked about Aurora, the fast-growing database service: Aurora data is copied into three regions, and capacity auto-scales up and down. Expedia, for example, does 4,000 writes and 5,000 reads per second against its Aurora database.
I will skip the keynote part where Dr. Werner talked about rapid application development, serverless architecture, code services, Kubernetes, and application services – that is not my cup of tea.
There was an awesome demo by Trax, a company that uses innovative image-recognition technology to help retail stores sort out their shelves and uses Artificial Intelligence to unlock business opportunities. As the next stage of their solution, they are planning to use robots to take the pictures inside stores, which is very cool.
Another great demo was about Flytrax, the first drone-based food delivery service. Drones are super fast (they can fly up to 60 km/h), affordable, and unaffected by traffic, and they can easily reach almost any location within a few minutes instead of driving for hours. Unfortunately, this service is not available in Israel yet.
Great keynote, #awssummit !
Yours,
Maria
Transform your query result into an Excel file using this technique.
Ava Robotics, a startup with strong technical ties to iRobot, just announced its telepresence robot. Her name is Ava, and she’s likely to win a lot of hearts.
For one thing, Ava is quite perceptive, using video technology from Cisco and integrating with Cisco Spark (which provides tools for team messaging, online meetings, and white boarding). Ava is also quite friendly. She allows her users to participate in remote meetings, wander down hallways at other facilities while chatting with colleagues, and enjoy face-to-face discussions with people who may physically be thousands of miles away.
Telepresence robots provide a lot of benefits to companies that are spread across many locations — especially those spanning continents — or with staff who work from home. They make work relationships considerably more productive — even for individuals who may never have met in person. Carrying on casual conversations and checking remote data centers and manufacturing facilities (sometimes safer than being there in person) can make huge differences in how staff coordinate and get important work done.
Demo and script on how to script out SSIS Environments and their associated variables
Power BI Dashboards can be kept current using simple DAX formulas in your data models. Save yourself time and energy by understanding how dashboard tiles work on PowerBI.com and what can be done to set your dashboards to a current period.
The performance increase columnstore indexes grant when reading data from the index is offset by the expensive process required to build the index. In this Stairway level, Hugo Kornelis walks you through the steps SQL Server takes when building (or rebuilding) a columnstore index.
VMware has expanded its portfolio of cloud tools to help enterprises improve the manageability of their public cloud and on-premises environments. At the same time, VMware announced the first global expansion of VMware Cloud on AWS, its joint hybrid cloud service with Amazon Web Services.
Complexity is on the rise for enterprises as they expand their use of cloud computing – which often is not limited to a single cloud provider. VMware estimates that nearly two-thirds of companies will use two or more cloud service providers in addition to their on-premises data centers.
By harnessing the concept of agility to a methodology that enables constant software innovation, DevOps allows organizations to respond dynamically to changing market conditions and rising customer expectations.
Cisco this week expanded its Tetration Analytics system to let users quickly detect software vulnerabilities and more easily manage the security of the key components in their data centers.
Introduced in 2016, the Cisco Tetration Analytics system gathers information from hardware and software sensors and analyzes it using big-data analytics and machine learning to offer IT managers a deeper understanding of their data center resources. The idea behind Tetration includes the ability to dramatically improve enterprise security monitoring, simplify operational reliability, and speed up application migrations to software-defined networking.
Database administrators have enormous responsibility whether they manage one or hundreds of servers. Monica Rathbun tells us how she survived as the Lone DBA for 56 database servers for over a decade. While many DBAs work on teams instead of alone, she has great advice for all.
Tara Kizer outlines 15 telltale signs that you aren't a very good senior DBA.
You’ve been performance tuning queries and indexes for a few years, but lately, you’ve been running into problems you can’t explain. Could it be RESOURCE_SEMAPHORE, THREADPOOL, or lock escalation? These problems only pop up under heavy load or concurrency, so they’re very hard to detect in a development environment.
Good Morning Experts,
We are taking a full backup every Saturday at 9 PM, differential backups daily at 9 PM, and transaction log backups every 30 minutes.
A developer accidentally deleted data from table at 2:4
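The question above is cut off, but the schedule it describes is the classic setup for a point-in-time restore: you need the most recent full backup before the target time, the most recent differential taken after that full (if any), and then the log backups up to and including the first one past the target, restored with STOPAT. As a rough sketch of that chain-selection logic (the timestamps below are hypothetical, not from the question):

```python
from datetime import datetime

def restore_chain(fulls, diffs, logs, target):
    # Most recent full backup taken at or before the target time.
    full = max(t for t in fulls if t <= target)
    # Most recent differential after that full, if one exists.
    later_diffs = [t for t in diffs if full < t <= target]
    diff = max(later_diffs) if later_diffs else None
    # Log backups after the diff (or full), up to and including the first
    # one past the target -- that last one is restored with STOPAT.
    start = diff if diff is not None else full
    needed = []
    for t in sorted(t for t in logs if t > start):
        needed.append(t)
        if t >= target:
            break
    return full, diff, needed

# Hypothetical schedule: full Saturday 9 PM, diff daily 9 PM, logs every 30 min.
fulls = [datetime(2018, 4, 7, 21, 0)]
diffs = [datetime(2018, 4, 8, 21, 0), datetime(2018, 4, 9, 21, 0)]
logs  = [datetime(2018, 4, 9, 21, 30),
         datetime(2018, 4, 9, 22, 0),
         datetime(2018, 4, 9, 22, 30)]
target = datetime(2018, 4, 9, 22, 10)   # just before the accidental delete
full, diff, needed = restore_chain(fulls, diffs, logs, target)
```

The actual restore would apply the full and differential WITH NORECOVERY, then each log in `needed`, stopping at the target time on the final one.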
Your organization’s culture of DevOps will often be the defining factor in whether you achieve a successful organization-wide change, or whether the changes you implement affect only pockets of your organization.
When a SQL Server database is operating smoothly and performing well, there is no need to be particularly aware of the transaction log, beyond ensuring that every database has an appropriate backup regime and restore plan in place. When things go wrong, however, a DBA's reputation depends on a deeper understanding of the transaction log, both what it does, and how it works.