Dell Technologies recently announced a new system for tracking employees who work on-site and remotely. The system will include colour-coded ratings, VPN monitoring, and badge-in tracking. The initiative is scheduled to begin on May 13 and has raised concerns about employee privacy both within and outside the company.
First reported by The Register, citing sources familiar with Dell’s internal workings, the decision stems from a directive by Jeff Clarke, the company’s chief operating officer, who is leading an effort to increase on-site presence among hybrid workers.
Employees in hybrid roles are expected to be on-site at a Dell office for at least 39 days per quarter, averaging three days a week.
“Employees who do not meet the attendance requirement will have their status escalated up the ladder to Jeff Clarke, who apparently believes that being a hall monitor trumps growing revenue,” said the source.
The tracking system, which surfaces weekly site-visit data through the company’s human capital management software, assigns colour-coded ratings to employees based on their on-site presence. The ratings range from ‘Blue’ for consistent attendance down to ‘Red’ for limited attendance, with ‘Green’ and ‘Yellow’ marking the intermediate tiers.
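To make the scheme concrete, a minimal sketch of how such a tier assignment could work is below. Dell has not published its cut-offs, so the thresholds and the `rate_attendance` helper are illustrative assumptions; only the 39-day quarterly target comes from the reporting.

```python
from enum import Enum

class Rating(Enum):
    BLUE = "consistent on-site presence"
    GREEN = "regular on-site presence"
    YELLOW = "occasional on-site presence"
    RED = "limited on-site presence"

def rate_attendance(onsite_days_per_quarter: int) -> Rating:
    # The 39-day quarterly target is from the reporting; the lower
    # cut-offs are illustrative assumptions, not Dell's actual rules.
    if onsite_days_per_quarter >= 39:   # roughly three days a week
        return Rating.BLUE
    if onsite_days_per_quarter >= 26:   # roughly two days a week
        return Rating.GREEN
    if onsite_days_per_quarter >= 13:   # roughly one day a week
        return Rating.YELLOW
    return Rating.RED

print(rate_attendance(40).name)  # BLUE
```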
However, the rollout has not been smooth. The consequences of each colour tier remain unclear, with conflicting messages from different managers: some insist that employees maintain a ‘Blue’ rating at all times, while others take a more flexible view, tolerating the occasional ‘Red’ flag.
![Dell’s employee tracking system, effective May 13, raises privacy concerns](https://candid.technology/wp-content/uploads/2019/06/Privacy-31319-1024x576.jpg)
Critics within the organisation have described the rollout as disorderly and worry about the consequences for employees who fail to meet the attendance criteria. Dell’s earlier ‘return to office’ strategy, unveiled this year, already drew attention for its potential impact on career progression and the prospect of layoffs for remote staff.
The tracking measures, which encompass badge-ins and VPN connections, aim to verify that employees are physically present when they claim to be, and to deter practices like ‘coffee badging,’ where employees swipe their badges and then leave the office almost immediately. The measures are believed to be a response to the significant number of staff who chose to keep working remotely after the earlier return-to-office mandate.
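As a rough sketch of how badge swipes and network signals could be combined to flag ‘coffee badging,’ consider the example below. Dell’s actual detection logic is not public, so the four-hour threshold and the use of last on-site activity as a presence signal are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Assumed threshold: how long an employee must remain active on-site
# after badging in for the visit to count. Purely illustrative.
MIN_ONSITE = timedelta(hours=4)

def is_coffee_badging(badge_in: datetime, last_onsite_activity: datetime) -> bool:
    """True if a badge-in was not followed by sustained on-site activity."""
    return (last_onsite_activity - badge_in) < MIN_ONSITE

badge_in = datetime(2024, 5, 13, 9, 0)
last_seen = datetime(2024, 5, 13, 9, 20)   # left 20 minutes after badging in
print(is_coffee_badging(badge_in, last_seen))  # True -> visit would be flagged
```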
Dell’s workforce composition reveals a significant proportion of remote workers, particularly outside the US, where approximately two-thirds of employees are remote; field and hybrid employees make up the remainder. Despite recent layoffs and financial filings indicating around 120,000 employees, internal figures suggest a headcount as high as 150,000, raising questions about the company’s reporting practices.
The aggressive stance on onsite tracking and the potential for further layoffs have led to speculation about Dell’s future workforce strategy. With industry trends showing a softening market and increased competition, employees are bracing for potential workforce reductions in the coming months.
Companies must not track their employees via VPN connections, even as market demand softens. It is crucial to respect employees’ privacy and to announce any monitoring in a clear, concise company-wide notification.
In the News: Nintendo Switch’s X integration ends on June 10