Data Storage – IT Talk (http://it-talk.org/)

SnapLogic Announces Partnership with Snowflake
https://it-talk.org/snaplogic-announces-partnership-with-snowflake/ (Tue, 27 Sep 2022)

SnapLogic announced its partnership with Snowflake during Snowflake’s Global Data Cloud Tour.

Since partnering with cloud computing company Snowflake in 2016, the two companies have worked closely to help joint customers gain deeper business insights from their growing amounts of data.

SnapLogic gave joint customers an easier and faster way to move data into Snowflake’s data cloud while helping them manage their data ecosystem. Customers can unify their siloed data, discover insights, and run a variety of analytical workloads.

“At Snowflake, our goal is to help customers quickly and seamlessly leverage valuable insights from their growing amounts of data,” said Tarik Dwiek, Head of Technology Alliances, Snowflake.

“We look forward to seeing SnapLogic on our next Data Cloud World Tour, where they will give customers insight into the tools they need for an easy and efficient way to quickly move, load and analyze their data – further accelerating the return on investment of their cloud data initiatives.”

For joint customers, SnapLogic provides the scalability needed to make the amount of data stored in Snowflake’s data cloud available and actionable to business decision makers.

Connectivity between SnapLogic’s connectors, Snaps, provides integrations into Snowflake’s platform through the company’s intuitive user interface and low-code platform.

“Achieving Premier Tier Partner status and participating in the Snowflake Data World Tour in APAC demonstrates to the region our commitment to delivering a modern, cloud-native iPaaS platform that enables joint customers to transform data into information in minutes,” said Uma Dubey, Head of Channel and Alliances APAC, SnapLogic.

“We look forward to continuing to work with Snowflake to help our growing local customer base maximize ROI from their analytics, BI and data sharing projects.”

Snowflake customers leverage the Data Cloud to bring together all types of data to support a variety of deployment models while ensuring rapid, governed access to data at scale. Snowflake has added support for unstructured data to the Snowflake Data Cloud, with built-in capabilities to store, manage, govern, share, and process unstructured data with the same performance, concurrency, and scale as structured and semi-structured data.

Around 80% of global data is unstructured. Unstructured data contained in documents, emails, web pages, images, and comments on blogs and social media sites can be valuable, making the processing of this type of data necessary for organizations that want to make data-driven decisions. While unstructured data is the largest category in volume and growth, extracting the necessary information and preparing it for use is currently a manual process that requires technical expertise.

“SnapLogic is the only option for end-to-end data integration across your enterprise. Using an advanced AI solution, called Iris AI, we can reduce data integration development time by 50%. This allows you to gain faster visibility into advanced analytics and maximum value from your cloud data warehouse platform. The result: faster, better results with more immediate ROI,” said Teresa, Director of Partner Marketing, SnapLogic.

No central storage of data, onus on hospitals to keep it safe: RS Sharma on Unique Health ID | Latest India News
https://it-talk.org/no-central-storage-of-data-onus-on-hospitals-to-keep-it-safe-rs-sharma-on-unique-health-id-latest-india-news/ (Sun, 25 Sep 2022)

There will be no central vault to store the health data of patients who use Ayushman Bharat Health Account (ABHA) numbers; instead, it will be the responsibility of affiliated hospitals to store and protect the data, the head of the National Health Authority, RS Sharma, said in an interview with HT.

The unique ABHA number is generated as part of the Ayushman Bharat Digital Mission (ABDM), which aims to develop the backbone needed to support the country’s integrated digital health infrastructure.

“It will bridge the existing gap between different stakeholders in the healthcare ecosystem through digital highways,” Sharma said. ABDM completes its first year since launch on September 27.

India’s health data management policy acts as a guiding document for the national digital health ecosystem, Sharma said. “It defines the minimum standards of data privacy protection, which must be met by entities participating in ABDM. However, as ABDM follows the principle of federated architecture, responsibility for the storage of health data, and for compliance with confidentiality and data protection standards, rests entirely with healthcare professionals and institutions.”

The “retention period of different types of data” is currently regulated by their respective laws, rules and regulations, Sharma said. “For example, the Pre-Conception and Pre-Natal Diagnostic Techniques (Prohibition of Sex Selection) Act, 1994 mandates the retention of relevant health records for two years, while the IMC (Professional Conduct, Etiquette and Ethics) Regulations, 2002 mandate keeping in-patient records for a period of three years,” he said.

Changing these laws and regulations is beyond the purview of the health authority, Sharma said.

“As critical health data is widely digitized across the country following the Ayushman Bharat Digital Mission, the National Health Authority has recently released the Draft Health Data Management Policy version 2.0,” said Kazim Rizvi, founder of The Dialogue, a think tank. “Following NHA data sharing guidelines, the digital health ecosystem has several safeguards against the violation of individuals’ privacy. The Data Management Policy grants data controllers the right to portability, access, confirmation and disclosure. It also clarifies that entities must comply with any upcoming data protection regulations.”

So far, over 243 million ABHA numbers have been created. Many people have discovered that their ABHA number was created without their consent.

In response to this, Sharma said people were asked during the Covid-19 pandemic vaccination campaign if they wanted to create an ABHA number. “In case a number has been created, it is the user’s choice whether to share it or not. The ABHA number only collects minimal data – name, gender, age and means of communication.”

At the same time, he stressed that privacy is a fundamental right. It will be up to the user whether or not to share the data, Sharma said.

“There is a consent manager built into the system. The patient must agree to share the data, which is equivalent to agreeing to share physical records. This requires their explicit consent,” he said. “The NHA will ensure minimal data collection.”
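The default-deny, explicit-consent flow Sharma describes can be sketched minimally. This is purely an illustration of the pattern, not ABDM’s actual consent-manager API; all names and identifiers below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentManager:
    # patient_id -> set of requesters the patient has explicitly approved
    grants: dict = field(default_factory=dict)

    def grant(self, patient_id: str, requester: str) -> None:
        self.grants.setdefault(patient_id, set()).add(requester)

    def revoke(self, patient_id: str, requester: str) -> None:
        self.grants.get(patient_id, set()).discard(requester)

    def share(self, patient_id: str, requester: str, records: list) -> list:
        # Default is deny: records flow only on explicit, unrevoked consent,
        # the digital equivalent of agreeing to hand over physical files.
        if requester not in self.grants.get(patient_id, set()):
            raise PermissionError("explicit patient consent required")
        return records

cm = ConsentManager()
cm.grant("abha-1234", "hospital-A")  # patient explicitly approves one hospital
shared = cm.share("abha-1234", "hospital-A", ["blood report"])
```

The key design point is that the gate sits between the data holder and every requester, so revoking consent immediately stops further sharing without touching the records themselves.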

“It will also be interesting to examine how consent managers are regulated within the healthcare ecosystem, in tandem with the account aggregators used by the Reserve Bank of India,” Rizvi said. “Consent aggregators will be a crucial part of the dashboards that citizens will have access to.”

Talking about the digital ecosystem that the ABDM aims to create, Sharma said that it will generate a register of health providers, professionals, medicines and nurses. “It will be the cornerstone of a digital health system,” he said.

Referring to data sharing with decision makers, Sharma said only anonymized metadata will be shared. “Metadata or aggregated anonymized data will help identify, verify and deliver targeted health services. For example, if the sale of paracetamol increases in an area, it is possible that many people suffer from fever. In this case, health services can be better administered in these areas,” he said. “An individual’s privacy will not be compromised.”

Regarding the next project, the Health Stack, Sharma said it will aim to create an interoperable and seamless health experience. “Once the registry, which contains five items, is completed, it will be easier for users to use health services,” he added.

IEEE 2883 data sanitization standard is a path to storage reuse and recycling
https://it-talk.org/ieee-2883-data-sanitization-standard-is-a-path-to-storage-reuse-and-recycling/ (Fri, 23 Sep 2022)

At the SNIA 2022 Storage Developers Conference, there were many parallel discussions on a new IEEE standard, 2883, on methods for sanitizing logical storage and physical storage, as well as providing requirements and technology-specific guidance for disposal of recorded data. This is an effort to update the data sanitization standards, officially developed by NIST (NIST SP800-88R1).

Disposal of stored data is an important consideration when retiring or reusing storage devices and systems and deserves some attention. Greater reuse of old storage devices can extend their lifespan, prevent destruction of storage devices, allow recovery of valuable components, and thus reduce the demand for resources needed to manufacture new storage devices, leading to a more circular economy. Much of the content in this article is taken from an OCP data sanitization white paper from July 2022 and the IEEE 2883 standard.

Companies storing data in the cloud need to ensure that their customers’ data is secure. It is common for these companies to physically destroy devices containing data such as hard drives and solid-state drives, despite the use of advanced encryption and security features on these devices that can ensure near-zero risk of data leakage. This physical destruction includes the punching and shredding of these devices. Such physical destruction makes it economically impossible to recover sub-components, such as rare-earth magnets from hard drives.

Extended use of storage devices and increased recovery of valuable end-of-life components can lead to reduced carbon emissions. An ideal circular economy uses reuse, sharing, repair, refurbishment and recycling to create a closed-loop system that minimizes the use of new materials and reduces the creation of waste, pollution and carbon emissions. Sanitizing the media in storage devices can securely prevent access to data while avoiding physical destruction. Sanitization has a specific meaning: it is a process or method for making access to target data on a storage medium infeasible for a given level of effort.

The IEEE P2883 Standard for Storage Sanitization details sanitization methods and techniques for various storage media (HDD, SSD, optical, removable, etc.). It specifies interface-specific techniques (SATA, SAS, NVMe). It aligns the industry with modern media sanitization terminology and techniques and targets all logical and physical locations of data, including user data, old data, metadata, over-provisioning, and more. The three basic sanitization methods are described below.

Clear uses logical techniques on user data at all addressable storage locations, protecting against simple non-invasive data recovery techniques that use the same host interface available to the user. Destroy essentially turns the storage device into slag. Purge is the most attractive approach for reusing storage devices. There are three purge methods, which can be used together to further reduce the likelihood of recovering data, although any one method alone is sufficient to prevent data recovery even with state-of-the-art lab data recovery techniques.

These three methods are:

1) Sanitize Purge Cryptographic Erase (CE) changes the media encryption key on a device (typically AES-256 today), which is not only a secure way to sanitize a device but also completes in seconds

2) Sanitize Purge Overwrite securely overwrites storage media with various patterns, which can be verified later. Overwrite can be used with hard drives that do not support CE

3) Sanitize Purge Block Erase zeroes the erase blocks on NAND-based SSDs and can be used in conjunction with CE

Note that on a modern hard drive, a sanitize purge overwrite takes about an hour per terabyte to complete. This leaves the hard drive with no recoverable user data.
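Why cryptographic erase completes in seconds while overwrite takes hours can be illustrated with a toy model: if every byte on the media is written through an encryption key held by the drive, discarding that key renders the stored ciphertext unrecoverable without rewriting anything. The sketch below uses a stdlib SHA-256 counter keystream as a stand-in cipher; it is illustrative only, since real self-encrypting drives use hardware AES-256 and a standardized Sanitize command:

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 over key + counter (a stand-in for AES-256).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Encryption and decryption are the same XOR operation.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# A self-encrypting drive writes everything through a media encryption key.
media_key = secrets.token_bytes(32)
user_data = b"patient records, emails, logs ..."
on_media = xor_crypt(user_data, media_key)

# Normal read path: the controller holds the key, so plaintext comes back.
assert xor_crypt(on_media, media_key) == user_data

# Cryptographic erase: replace the key. The old ciphertext is still
# physically present on the platters/NAND, but without the old key it is
# indistinguishable from random noise.
media_key = secrets.token_bytes(32)
assert xor_crypt(on_media, media_key) != user_data
```

Only the 32-byte key changes; the terabytes of ciphertext never need to be touched, which is why CE is near-instant regardless of capacity.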

The IEEE 2883 data sanitization standard defines methods for securely erasing data from storage devices, thereby preventing unauthorized access to data. Use of this standard enables reuse and recycling of various digital storage devices and can contribute to a circular economy in digital storage devices and systems and lower carbon emissions.

Huawei published the white paper “Data Storage Power”
https://it-talk.org/huawei-published-the-white-paper-data-storage-power/ (Wed, 21 Sep 2022)

BANGKOK, September 21, 2022 /PRNewswire/ — Today at Huawei Connect 2022, Huawei officially released its white paper “Data Storage Power – The Digital Cornerstone of High-Quality Society Development”. The white paper defines quantitative indicators to measure data storage capacities and analyzes the current data storage landscape in different regions of the world. It aims to help governments and businesses better assess, design and develop data storage capabilities.

Gu Xuejun, Vice President of Huawei IT Product Line, said, “Data storage power is currently measured by capacity. However, with the rapid development of the industry and the emergence of new diversified data services such as AI and big data, capacity alone is not enough to measure the future development and construction of data storage systems. We need a more scientific definition and evaluation system to effectively measure data storage capabilities.”

The smart world is driving explosive data growth in all industries, and the digital transformation of these industries requires powerful data storage capabilities or data storage power. The white paper explains the concept of data storage power and provides:

A concept and connotation of data storage power: Data storage power is an all-encompassing concept that includes storage capacity (the core), performance, reliability, and ecosystem.

Quantitative research into the value of data storage: Huawei’s calculations show that a data storage investment of $1 contributes a direct value of $5, an indirect value of $8, and an induced value of $30–40.

A system of indicators that evaluates the data storage power of an area or a data center: This system includes 35 indicators at three levels in four directions – scale, efficiency, preparatory work and progress – based on the characteristics of countries and companies.

An assessment of storage power in 20 countries and regions: The white paper analyzes why some countries top the rankings for data storage power and how those that rank lower can catch up. It also provides policy suggestions to improve data storage power.

Gu Xuejun said, “I view this white paper as a meaningful exploration that will spark more interest in promoting the development of the data storage industry. Only when data is well stored, quickly computed, and stably transmitted through networks can digital infrastructure unlock the value of data and better promote quality economic and social development.”

For more information, please visit: https://e.huawei.com/topic/data-storage-power-mega/en/

SOURCE Huawei

Don’t lose your data! Migrate to OneDrive by October 1 | FIU News
https://it-talk.org/dont-lose-your-data-migrate-to-onedrive-by-october-1-fiu-news/ (Mon, 19 Sep 2022)

Student and alumni emails have already moved from FIU’s Google accounts to Microsoft 365, a migration led by the Information Technology Division. The transition enables the university to operate more efficiently; provides enhanced security to mitigate spam, phishing and other attacks; and gives the Panthers access to the latest technology.

But the process is not over. If you want to keep the files and documents currently located in your Google Drive, you have a role to play in the migration.

To keep these files, you need to move your Google Drive content to OneDrive (or other digital storage of your choice).

The last day you can access your FIU Google Drive – and the last day to migrate your content – is Saturday, October 1.

You won’t be able to access your files after that day, so be sure to mark your calendar and save any content you like.

The Panthers will have 1.5TB of storage on OneDrive to migrate their content. In addition, alumni will only have access to OneDrive until January 1, 2024. We strongly encourage alumni to find a more permanent location for their data, as the university is currently unable to extend access to OneDrive beyond this period. Microsoft will offer a discounted Microsoft 365 subscription to FIU graduates after this date; visit the Microsoft 365 website for more information.

Ways to migrate your data:

Mover is Microsoft’s tool for exporting/importing content. See the step-by-step guide. Note that this tool only migrates content from Google Drive to OneDrive.

Google Takeout is Google’s tool for exporting content from your Google account. Learn more with the IT how-to guide for Google Takeout.

The DIY option is to download the content from Google and then upload it to OneDrive.
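For the DIY route, once a Google Takeout export is unzipped, moving it into a locally synced OneDrive folder is an ordinary recursive file copy. A minimal sketch (the paths and function name are hypothetical examples, not an FIU-provided tool):

```python
import shutil
from pathlib import Path

def migrate(takeout_dir: str, onedrive_dir: str) -> int:
    """Copy every file from an unzipped Takeout export into a locally
    synced OneDrive folder, preserving the directory structure."""
    src, dst = Path(takeout_dir), Path(onedrive_dir)
    copied = 0
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied += 1
    return copied

# Hypothetical usage once both folders exist locally:
# n = migrate("Takeout/Drive", "OneDrive - FIU/Drive backup")
```

The OneDrive sync client then uploads the copied files in the background, so no API calls are needed for the DIY approach.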

If you have any questions, the Information Technology Division will host a OneDrive-focused virtual training for students on Wednesday, September 21.

Learn more about the transition to Microsoft 365 on the Information Technology Division resource page.

Voting machine data breaches worry midterm candidates
https://it-talk.org/voting-machine-data-breaches-worry-midterm-candidates/ (Fri, 16 Sep 2022)

ATLANTA (AP) — Sensitive voting system passwords posted online. Copies of confidential voting software available for download. Voting machines inspected by people who were not supposed to have access to them.

The list of alleged security breaches at local election offices since the 2020 election continues to grow, with investigations underway in at least three states – Colorado, Georgia and Michigan. The stakes seemed to rise this week with the revelation of a federal investigation involving a prominent loyalist of former President Donald Trump who has promoted conspiracy theories about voting machines across the country.

While much remains unknown about the investigations, one of the most pressing questions is what all of this could mean for the safety of voting machines with the midterm elections less than two months away.

Election security experts say the breaches by themselves did not necessarily increase threats to the November vote. Election officials already assume that hostile foreign governments could hold the sensitive data, so they are taking precautions to protect their voting systems.

The most immediate concern is the possibility that rogue election workers, including those who sympathize with lies about the 2020 presidential election, could use their access to election materials and the knowledge gained from the breaches to launch an attack from the inside. Such an attack could be intended to gain an advantage for a favored candidate or party, or to introduce system problems that would further sow distrust in the election results.

In some of the alleged security breaches, authorities are investigating whether local officials provided unauthorized access to people who copied software and hard drive data and, in several cases, shared it publicly.

After the breach in Georgia, a group of election security experts said the unauthorized copying and sharing of rural Coffee County election data posed “serious threats” to the November election. They urged the state electoral board to replace touchscreen devices used statewide and use only hand-marked paper ballots.

[Image: A 2020 election voting machine was somehow sold by a Goodwill store. Source: CNN/KVRR/Getty/Goodwill]

Harri Hursti, a leading voting security expert, expressed concern about another use of the breached data: access to voting equipment data or software could be used to produce a realistic video in which someone claims to have manipulated a voting system, he said.

Such a fake video posted online or on social media on or after Election Day could create chaos for an election office and cause voters to challenge the accuracy of the results.

“If you have these rogue images, now you can start fabricating compelling fake evidence — fake evidence of wrongdoing that never happened,” Hursti said. “You can start creating very compelling imaginary evidence.”

There is no indication that the voting machines were tampered with, either during the 2020 election or in this year’s primaries. But conspiracy theories widely promoted among some conservatives have led to calls to replace machines with ballots marked and counted by hand and raised concerns that they could be targeted by people working in electoral offices or at polling stations.

The alleged breaches appear to have been orchestrated or encouraged by people who falsely claim that the 2020 election was stolen from Trump. In several of the cases, employees of local election offices or election commissions gave access to the voting systems to people who were not authorized to have it. The incidents came into public view after Mesa County, Colorado’s voting system passwords were posted online, sparking a local investigation and a successful effort to remove the county clerk from supervising elections.

MyPillow CEO Mike Lindell, who has hosted or attended forums promoting US voting machine conspiracy theories, said this week he received a subpoena from a federal grand jury investigating the breach in Colorado and that he had been ordered to turn over his cell phone to FBI agents who approached him at a fast-food restaurant in Minnesota.

“And they told me not to tell anyone,” Lindell said in a video afterwards. “Okay, I won’t. But I am.”

Lindell and others have traveled the country over the past year, hosting events where attendees are told that voting machines have been tampered with, that officials are being “selected” rather than elected, and that widespread fraud cost Trump the 2020 election.

In an interview with the Minneapolis Star Tribune, Lindell said FBI agents questioned him about the Colorado breach and Dominion Voting Systems. The company provides voting equipment used in about 30 states and its machines have been targeted in breaches in Colorado, Georgia and Michigan.

When agents asked him why he was flying between different states, Lindell told them, “I go to attorneys general and politicians, and I try to get them to get rid of these voting machines in our country.”

The Justice Department did not respond when asked for details of its investigation.

Dominion has sued Lindell and others, accusing them of defamation. In a statement this week, the company said it would not comment on ongoing investigations but that its systems are secure. It noted that no credible evidence has been provided to show that its machines “did anything other than accurately and reliably count votes in all states.”

The scope of the federal grand jury’s investigation in Colorado is not known, but local authorities have charged Mesa County Clerk Tina Peters in what they described as a “deceptive scheme designed to sway officials, violate security protocols, exceed authorized access to voting materials, and trigger potential distribution of confidential information to unauthorized individuals.”

Peters pleaded not guilty and said she had the authority to investigate concerns that voting materials had been tampered with. She has appeared at numerous events with Lindell over the past year, including Lindell’s “cybersymposium” last August at which a digital copy of the Mesa County Election Management System was distributed.

David Becker, a former attorney for the US Department of Justice who now directs the Center for Election Innovation & Research, noted the irony that people raising alarms about the integrity of voting equipment are themselves implicated in the alleged breaches of those same systems.

“People who have attacked the integrity of elections are destroying the real integrity of elections,” he said.

___

Associated Press writer Michael Balsamo contributed to this report.

___

Follow AP voting coverage at: https://apnews.com/hub/vote

Mudge’s testimony highlights a problem for Twitter and Facebook: where is your data?
https://it-talk.org/mudges-testimony-highlights-a-problem-twitter-facebook-where-is-your-data/ (Thu, 15 Sep 2022)

During a congressional hearing on Tuesday, Twitter whistleblower Peiter Zatko was repeatedly asked whether Twitter knows how its user data is accessed and stored.

On several occasions, he gave an awkward answer: the company doesn’t know.

The problem, however, extends far beyond Twitter, according to a range of engineers and experts in Silicon Valley. In a recent court hearing, for example, a senior Meta engineer also struggled to provide answers to questions about how Facebook pulls together all the information it collects about its billions of users.

“I would be surprised if there was even one person who could conclusively answer this narrow question,” the engineer said, in an exchange of court testimony first reported by The Intercept. Facebook provided the court with a list of 55 systems and databases where user data could be stored.

Tech giants like Google, Facebook, and Twitter were founded more than 15 years ago and developed freewheeling cultures in which engineers and teams could create databases, algorithms, and other software independently of one another. Speed was favored over security measures that might slow things down. That was before years of privacy lawsuits and legislation pushed companies to tighten their data practices.

But experts said companies are still struggling to pay off years of technical debt as regulators and consumers demand more from tech companies, such as the ability to delete data or to know exactly what is being collected about a person. And some of those speed-first practices haven’t changed.


“Many Twitter engineers took the position that security measures were making life difficult for them and slowing people down,” said Edwin Chen, who has held engineering positions at Twitter, Google and Facebook and is now CEO of the content-moderation start-up Surge AI. “And that’s certainly a bigger issue than just Twitter.”

Some of these systems are black boxes even for those who built them, said Katie Harbath, former Facebook policy director and CEO of the consultancy Anchor Change (Facebook changed its name to Meta last year). Even if the right policies are in place, they can be difficult to implement when the underlying databases were never designed to answer questions such as: what are all the places where a person’s location or profile could be stored?

“It’s hard to go back and start from scratch, especially as you grow,” she said. “The way these platforms were originally set up, each team had enormous autonomy.”
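What regulators and plaintiffs are effectively asking for is a data inventory: a registry, maintained as teams create schemas, of which internal systems hold which categories of user data, so a question like “where could a person’s location be stored?” becomes answerable. A minimal sketch of the idea (hypothetical names throughout; this is not any company’s actual tooling):

```python
from collections import defaultdict

class DataInventory:
    """Registry mapping data categories (e.g. 'location') to the internal
    systems that store them. Teams register at schema-creation time."""
    def __init__(self):
        self._index = defaultdict(set)

    def register(self, system: str, categories: set) -> None:
        # Called whenever a team stands up a new table or service.
        for cat in categories:
            self._index[cat].add(system)

    def systems_holding(self, category: str) -> set:
        # The question a deposition asks: every place this data could live.
        return set(self._index.get(category, set()))

inv = DataInventory()
inv.register("ads_profile_db", {"location", "interests"})
inv.register("checkin_log", {"location"})
inv.register("graph_store", {"friends"})
```

The hard part, as the testimony suggests, is not the registry itself but retrofitting registration onto thousands of systems that were created autonomously over 15 years.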

In the Meta court case, a Northern California class action lawsuit over the Cambridge Analytica privacy scandal that the company settled last month, plaintiffs demanded that the company show them the full information it collects and stores about them. This could include people’s precise locations throughout the day, health conditions they researched, groups they joined, and inferences such as the likelihood that a person is married.

Facebook initially offered data from the company’s “Download Your Information” tool, but a judge found in 2020 that the information provided by Facebook was too limited. Yet Facebook’s response, recorded in a deposition this summer, was essentially that even the company’s own engineers didn’t know where all the data was.

Dina El-Kassaby, a spokeswoman for Meta, Facebook’s parent company, said the deposition did not mean the company was failing on security or data access. “Our systems are sophisticated and it should come as no surprise that no engineer in the company can answer every question about where every piece of user information is stored,” she said. “We have one of the most comprehensive privacy programs in place to oversee data usage across our operations and to carefully manage and protect people’s data. We have made – and continue to make – significant investments to meet our privacy commitments and obligations, including extensive data controls.”


During Tuesday’s Senate hearing with Zatko, the whistleblower and former security chief made similar comments about Twitter. He noted that in a recent data breach, Twitter accidentally leaked the personal information of 50 million employees (Zatko’s attorney later issued a corrective statement saying Zatko meant 20,000).

Zatko noted during the hearing that Twitter has nothing close to that many employees — the current number is 7,000 — and pointed out that Twitter keeps too much information about former employees and contractors that it then fails to delete.

He repeatedly claimed that the company had up to 4,000 engineers – more than half of all company employees – with wide access to internal systems and few ways to officially track who accessed what. It was a dangerous situation, he said, because an individual employee could take control of a Twitter account and impersonate its owner.

If that employee was secretly working for a foreign government, the risks of giving employees wide latitude to access user data are much greater. Zatko alleged that Twitter knowingly had employees who worked for both the Indian and Chinese governments, but did not provide evidence to support those claims.

And in a separate report on the company’s ability to combat misinformation, part of the trove of documents Zatko provided to Congress, an independent auditor noted that Twitter lacks a formal system to track cases of users who violated company rules.

Twitter repeatedly pushed back against Zatko’s claims. A spokeswoman, Rebecca Hahn, previously told The Washington Post that Twitter had significantly tightened security since 2020, that its security practices are in line with industry standards, and that it has specific rules about who can access company systems. In response to Tuesday’s hearing, Hahn reiterated that Zatko’s allegations were “riddled with inconsistencies and inaccuracies,” but declined to address specifics.


David Thiel, technical director of the Stanford Internet Observatory at Stanford University and a former Facebook security engineer, said that after reading Zatko’s disclosures, he felt that Twitter’s security processes seemed to be years behind Facebook’s. He noted that Facebook had significantly tightened access in response to various controversies over the years, including the allegation that it allowed the firm Cambridge Analytica to access user data, to the point that if an engineer accessed a system he did not have permission to access, “someone will come after you and you will be fired.”

But he said it’s still common in Silicon Valley to give engineers wide access so they can “build great products quickly.”

“The focus,” he said, “is always on speed and access.”

He said sometimes companies, including Facebook, really can’t know everything that’s inside their systems.

For example, machine learning systems and software algorithms are built from tens of thousands of data points, often computed on the fly. Although it is possible to feed data points into such a system, one cannot then go back and retrieve the original entries. He drew a food analogy, noting that it would be impossible to turn a soup back into its original ingredients.
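The soup analogy can be made concrete with a toy sketch (the function and values below are purely illustrative, not anything from Facebook’s systems): once individual records are folded into an aggregate, many different inputs produce the same output, so the original entries cannot be recovered from it.

```python
# Toy illustration (hypothetical): aggregation is lossy.
# Once individual values are collapsed into a summary statistic,
# the original entries cannot be reconstructed from the result.

def aggregate(values):
    """Collapse individual data points into a single average."""
    return sum(values) / len(values)

# Two completely different sets of "user signals"...
a = [10, 20, 30]
b = [19, 20, 21]

# ...yield the same aggregate, so the mapping cannot be inverted.
print(aggregate(a))  # 20.0
print(aggregate(b))  # 20.0
```

The same irreversibility applies, at far greater scale, to model weights derived from millions of training examples.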

But other data, he said, is simply complex, and companies are resisting the extensive work it would take to track it all down — and would likely only do so if compelled by new laws or court rulings.

It’s not “so complicated that it’s not doable,” he said.

Virtual Data Storage Market Opportunities https://it-talk.org/virtual-data-storage-market-opportunities/ Tue, 13 Sep 2022 20:43:00 +0000

MarketsandResearch.biz has just released Global Virtual Data Storage Market from 2022 to 2028, which outlines future market development, opportunities, and the current dynamics of the virtual data storage business. Organizations trying to launch a product or expand their reach in the virtual data storage market will find the research valuable. It will also benefit suppliers and customers in other connected industries.

The review offers regional insights into the virtual data storage market, further broken down by country to give organizations a complete picture. In the corporate-profile section, special consideration is given to major stakeholders. Financial revenue, geographic presence, business overview, products sold, and the key steps players have taken to stay ahead of the competition are covered in this part.

DOWNLOAD A FREE SAMPLE REPORT: https://www.marketsandresearch.biz/sample-request/275205

This report contains a detailed forward-looking analysis of the business. Through a consistent view of manufacturers, product categories, and end-user segments, the review explains the economic status of major regions and the details of its forecasts. In addition, it provides thorough profiles of the central players.

Product types covered in the report include:

Application types covered in the report include:

  • BFSI
  • Company
  • Media
  • Other

Key Players Covered in the Market Report are:

  • RR Donnelley
  • Drooms GmbH
  • CapLinked
  • Vault room
  • Merrill Corporation
  • Intralinks Holdings
  • HighQ Solutions Limited

The countries covered in the market report are:

  • North America (United States, Canada and Mexico)
  • Europe (Germany, France, UK, Russia, Italy and Rest of Europe)
  • Asia-Pacific (China, Japan, Korea, India, Southeast Asia and Australia)
  • South America (Brazil, Argentina, Colombia and rest of South America)
  • Middle East and Africa (Saudi Arabia, United Arab Emirates, Egypt, South Africa and Rest of Middle East and Africa)

ACCESS THE FULL REPORT: https://www.marketsandresearch.biz/report/275205/global-virtual-data-storage-market-2022-by-company-regions-type-and-application-forecast-to-2028

The entire market is covered, with particular emphasis on scope, production, value, loss/profit, supply/demand and import/export. It also contains data on key companies. Readers tracking the industry’s development will benefit from the report’s top-down analysis. A SWOT analysis, an investment feasibility study and a return-on-investment survey are also included in this research. For a clearer picture of the industry, targeted information such as current trends, opportunities, drivers, restraints, and metrics is obtained from reliable sources.

HPC, Data Analytics, Storage and Management Market Size https://it-talk.org/hpc-data-analytics-storage-and-management-market-size/ Fri, 09 Sep 2022 13:00:00 +0000

Pune, Sep 09, 2022 (GLOBE NEWSWIRE) — The global HPC, analytics, storage and data management market will reach USD 24.98 billion in 2022 and is expected to grow at a CAGR of 19.43% over the forecast period 2023 to 2032, according to recent global market research by Quince Market Insights.
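Figures like these can be sanity-checked with the standard compound-growth formula; the projection below is derived arithmetically from the stated 2022 base and CAGR, not a number taken from the report itself.

```python
# Project a market size forward using the compound annual growth rate (CAGR):
#   value_n = value_0 * (1 + rate) ** years

def project(value_0, cagr, years):
    """Compound a starting value forward at a fixed annual growth rate."""
    return value_0 * (1 + cagr) ** years

base_2022 = 24.98   # USD billion, stated 2022 market size
cagr = 0.1943       # 19.43% stated CAGR
years = 10          # 2022 -> 2032

# Implied 2032 market size, in USD billion (derived, not from the report)
print(round(project(base_2022, cagr, years), 2))
```

At a 19.43% CAGR the market would roughly double every four years, which is why forecast reports over a decade-long horizon quote multiples of the base-year figure.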

With HPC, users can process massive amounts of data faster than with a conventional computer, delivering insights sooner and helping organizations stay ahead of the competition. The power of HPC solutions can far exceed that of the most powerful laptops, making it possible to work with terabytes (TB) of data, millions of scenarios and other large analytical calculations.

Get a sample copy of this report @ https://www.quincemarketinsights.com/request-sample-87693

Thanks to technologies such as the Internet of Things (IoT), artificial intelligence (AI) and 3D imaging, the number and volume of data that organizations have to work with is increasing rapidly. For a variety of jobs, such as sports event live broadcast, storm development monitoring, new product testing and market trend analysis, the ability to analyze data in real time is crucial.

Drivers:

  • Growing demand for computing in healthcare
  • Growing demand for automated trading and tracking current stock patterns

Restraints:

  • High-speed data transfer between compute servers and data storage must be supported by network components.

Opportunity:

  • Growing installation of platforms capable of securely analyzing a huge database of private health data to accelerate the search for treatments, vaccinations and disease mutations.

Challenges:

  • Requires reliable IT infrastructure to process, store and analyze huge volumes of data.

Impact of COVID-19 on the HPC, Data Analytics, Storage and Management Market

  • University of British Columbia scholars are working on a variety of COVID-19-related projects, many of which require access to high-performance computing.
  • The Center for Genome Research and Biocomputing at Oregon State University has a long history of using AMD systems. The Monarch Initiative’s use of Knowledge Graphs and the State of Oregon’s TRACE program, which monitors changes in SARS-CoV-2, was made possible by the servers of the COVID-19 HPC Fund.

HPC, data analytics, storage and management market, by products and services

On the basis of products and services, the HPC, data analytics, storage and management market is divided into data analytics software and workbenches, data analytics services, storage, management and cloud computing solutions.

The software and workbenches segment is expected to grow over the forecast period.

HPC, Data Analytics, Storage and Management Market, By Application

Based on application, the HPC, data analytics, storage and management market is segmented into next generation sequencing, microscopy, chromatography, flow cytometry, spectroscopy, and others.

The next generation sequencing segment is expected to drive the market growth at a significant CAGR during the forecast period.

HPC, Data Analytics, Storage and Management Market, By End User

Based on the end-user, the HPC, data analytics, storage and management market is divided into pharmaceutical and biotechnology companies, research centers, academic and government institutions, hospitals and clinics, others.

Hospitals and clinics are expected to grow over the forecast period.

Survey before buying this report @ https://www.quincemarketinsights.com/enquiry-before-buying/enquiry-before-buying-87693

HPC, data analytics, storage and management market, by region

Based on region, the HPC, data analytics, storage and management market is segmented into Asia-Pacific, Middle East and Africa, North America, Europe, and South America.

The United States dominated the HPC, data analytics, storage and management sectors in 2018, due to the prominence of several pharmaceutical and biotechnology companies that focus on drug research and generate vast volumes of data.

Recent development in HPC, data analytics, storage and management market

  • In 2018, AMD launched the AMD Radeon Instinct MI60 and MI50 accelerators, the world’s first 7nm data center GPUs for next-generation deep learning, HPC, cloud computing and rendering applications to improve computer performance. It would help researchers and scientists to effectively deal with HPC applications in various industries such as life sciences, universities, etc. This has helped AMD expand its product portfolio in the AI, cloud computing, and HPC segment.
  • In 2017, Microsoft (US) and Advanced Micro Devices, Inc. (AMD) worked together to integrate Project Olympus, Microsoft’s latest hyperscale cloud hardware design, with the cloud capabilities of the next-generation “Naples” processor from AMD. The new solution will provide additional memory, faster I/O channels, increased scalability, performance and efficiency through this collaboration. This will help the company expand its server and data center markets.
  • In 2017, Cisco Systems, Inc. announced the acquisition of Viptela, an American provider of cloud, software and network solutions. Cisco Systems will combine technologies offered by Viptela, such as cloud-based network management, routing platforms and solutions, to provide secure and intelligent platforms for digital transformation. This will expand Cisco’s SD-WAN portfolio by improving the flexibility and performance delivered through the cloud.

Some key points from the global HPC, data analytics, storage and management market report are:

  • In-depth segmentation of the global HPC, data analytics, storage and management market, along with trend-based information and factor analysis.
  • Key players in the global HPC, data analytics, storage and management market include Advanced Micro Devices, Inc., Cray, Inc., Cisco Systems, Inc., IBM Corporation, Intel Corporation, Lenovo Group Limited (China) and Hewlett Packard Enterprise Company (USA).
  • The effects of COVID-19 on the HPC, analytics, storage and data management market globally

Find more information on this topic in the report “HPC, Data Analytics, Storage and Management Market, by products and services (data analytics software and workbenches, data analytics services, storage, management and cloud computing solutions), by application (next generation sequencing, microscopy, chromatography, flow cytometry, spectroscopy, others), by end user (pharmaceutical and biotechnology companies, research centers, academic and government institutions, hospitals and clinics, others), by region (Asia-Pacific, Middle East and Africa, North America, Europe and South America)”.

Buy Now Full Report @ https://www.quincemarketinsights.com/insight/buy-now/hpc-data-analysis-storage-management-market/single_user_license

Contact us:

Ajay D

Quince Market Insights

Pune, India

Phone: USA +1 208 405 2835

UK +44 1444 39 0986

APAC +91 706 672 4848

Email: sales@quincemarketinsights.com

Web: www.quincemarketinsights.com

Browse related reports:

Cloud High Performance Computing (HPC) Market, By Service Type (IAAS HPC, PAAS HPC, Data Organization and Workload Management, Clustering Software and Analytics Tool, Professional Service and Managed Service), by deployment model (public cloud, private cloud, hybrid cloud), by organization size (small and medium enterprises (SMEs), large enterprises), by end user (academia and research, biosciences, design and engineering, financial services, government, manufacturing, media, online entertainment and gaming, weather and environment), by region (North America, Western Europe, Eastern Europe, Asia-Pacific, Middle East and Rest of the World) – Size of the market and forecasts until 2028

https://www.quincemarketinsights.com/industry-analysis/cloud-high-performance-computing-hpc-market

Cellulose Ether and Derivatives Market, By Product Type (Methylcellulose and Derivatives, Carboxymethylcellulose, HEC, HPC, EC), Application (Construction, Pharmaceutical, Personal Care, Food & Beverage), By Region (North America, Europe, Asia-Pacific, Middle East & Africa and South America) – Market Size and Forecast to 2030

https://www.quincemarketinsights.com/industry-analysis/cellulose-ether-derivatives-market

High Performance Computing (HPC) Market, By Component (Solutions (Servers, Storage, Networking Devices, and Software) and Services), Deployment Type, Organization Size, Server Price Range, Application Area, By Region (North America, Europe, Asia-Pacific, Middle East & Africa and South America) – Market Size and Forecast to 2030

https://www.quincemarketinsights.com/industry-analysis/high-performance-computing-hpc-market

Quantum StorNext File System Now Available as a Subscription Service on AWS Marketplace https://it-talk.org/quantum-stornext-file-system-now-available-as-a-subscription-service-on-aws-marketplace/ Fri, 09 Sep 2022 11:58:00 +0000

Quantum, exhibiting at IBC in booth 7.C39, announced the immediate availability of Quantum’s StorNext File System as a subscription offering on AWS Marketplace. AWS Marketplace is one of the fastest ways to deploy StorNext shared storage and allows users to connect from anywhere to edit videos in the cloud as a team. Users can speed up post-production workflows by accessing data and collaborating remotely, eliminating the need to copy or transfer files between users.

Physical and cloud instances of StorNext can easily move, sync or replicate content to unify workflows, and can ingest from or publish to cloud storage such as ActiveScale or Amazon S3 services such as Amazon S3 Glacier.

“Today’s announcement builds on the work Quantum and AWS are doing to help customers accelerate their journey to the cloud. This is the first time Quantum software has been available for licensing and deployment on public cloud infrastructure, making it faster than ever to deploy and use StorNext,” said Nick Elvester, General Manager, Primary Storage, Quantum. “Now our customers can quickly establish a shared cloud storage environment for creative users and easily move data between their on-premises and cloud StorNext environments to meet new production demands.”

StorNext is offered in a subscription model through AWS Marketplace without requiring any hardware infrastructure build, custom configuration, or client software installation. This represents another step in Quantum’s cloud strategy to meet customer needs for agile production content storage operations, which now include StorNext, ActiveScale™ object storage systems supporting AWS Outposts and Quantum’s leadership position in providing cold storage infrastructure to some of the world’s largest hyperscalers.

StorNext on AWS Marketplace is available in a choice of subscription periods in four configurations, starting with a full StorNext environment of 12 terabytes (TB) of capacity. Each deployment can complement existing StorNext installations in a hybrid model or as a fully cloud-based model to meet emerging customer and partner needs and requirements.

Each StorNext solution provides a complete StorNext shared storage environment and includes:

  • StorNext file system deployments in a choice of configurations from 12 TB to 62.5 TB, with pre-configured, out-of-the-box StorNext services to manage and monitor clients, storage volumes, and data movement services such as Quantum FlexSync™ and FlexTier™.
  • A range of subscription plans from monthly to multi-year contracts to fit customers’ OPEX budgets. Usage-based consumption patterns are factored into customer-negotiated AWS Enterprise Discount Program (EDP) purchase commitments.
  • Available as a private offer from authorized Quantum and AWS consulting partners, such as Integrated Media Technologies and ThunderCat, for custom pricing and other services.
  • Efficient support for Amazon Simple Storage Service (Amazon S3) for mass data storage, as well as Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Elastic Block Store (Amazon EBS).
  • Easy connection via SMB with no client software installation or configuration required.
  • Optional Quantum Distributed LAN Client (DLC) available at no additional cost to provide native platform file system experience on macOS, Windows and Linux clients.
  • Optional tiering of data and content from StorNext volumes to Amazon S3, Amazon S3 Glacier, or other AWS storage services.
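At the API level, tiering of the kind listed above comes down to choosing a storage class when an object is written to S3. The sketch below is a generic illustration of that S3 mechanism, not Quantum’s FlexTier implementation; the bucket and key names are hypothetical.

```python
# Generic sketch of S3 tiering: the StorageClass parameter of PutObject
# selects the target tier. Bucket/key names here are hypothetical.

HOT, COLD = "STANDARD", "GLACIER"  # S3 Standard vs. S3 Glacier Flexible Retrieval

def tier_request(bucket, key, storage_class):
    """Build PutObject parameters that place an object in a given tier."""
    return {"Bucket": bucket, "Key": key, "StorageClass": storage_class}

# With real AWS credentials this request would be sent via boto3, e.g.:
#   boto3.client("s3").put_object(Body=data, **tier_request(
#       "example-stornext-tier", "archive/edit.mov", COLD))
print(tier_request("example-stornext-tier", "archive/edit.mov", COLD))
```

Retrieval is the asymmetry to keep in mind: objects written with the `GLACIER` class must be restored before they can be read, which is why products layer lifecycle policies and data movers on top of this call.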

“We have long been a trusted architect of StorNext workflows on physical and converged appliances. Now, as an AWS Consulting Partner, we have a range of deployment options to deliver StorNext environments on AWS to help customers meet and plan for any data flow, analytics, or protection need,” said Nic Perez, CTO for Cloud at ThunderCat Technology.

“AWS Marketplace helps customers innovate faster and source solutions more easily,” said Mona Chadha, Director of AWS Marketplace and ISV Alliances. “Having Quantum’s StorNext on AWS Marketplace will enable customers to build highly agile data and content workflows on AWS, meeting critical business needs and accelerating time to market.”

Customers can also purchase bundled professional services offerings with StorNext on the AWS Marketplace from Quantum resellers and AWS consulting partners like Integrated Media Technologies, Inc. (IMT) and ThunderCat.
