Connection Community: Official Technology Community of Connection (https://community.connection.com/)

Choosing the Right AI PC for Work Starts with Intel® Core™ Ultra 200V
Mar 25, 2026 | Jeff McCobb
https://community.connection.com/choosing-the-right-ai-pc-for-work-starts-with-intel-core-ultra-200v/

Business laptop refresh decisions look different than they did just a few years ago. What was once a routine hardware cycle now must account for Windows 11 requirements, modern security standards, and a growing number of AI-powered features running directly on the device. IDC describes this shift as the start of the “AI PC era,” where more AI workloads move onto the PC to improve responsiveness and everyday productivity.

Many organizations have already enabled AI tools across their workforce—only to discover that older laptops weren’t built for sustained local AI workloads. The result can be slower performance, system freezes, louder fans, and reduced battery life. Instead of improving productivity, AI can strain hardware that lacks dedicated acceleration designed to handle these tasks efficiently.

For IT, procurement, and finance teams, the challenge is selecting laptops that genuinely support modern AI-enabled work while remaining secure, manageable, and cost-effective—without overbuying performance most employees don’t need. For many organizations, Intel® Core™ Ultra 200V processors offer a practical default, combining mobility, efficiency, and built-in AI acceleration in a way that aligns with how most business professionals actually work.

What Business Professionals Actually Need from a Work Laptop

Most business users aren’t running engineering simulations or advanced creative software. Their day revolves around meetings, browser tabs, documents, dashboards, and constant application switching.

Typical workloads include:

  • Collaboration tools such as Microsoft Teams or Zoom
  • Office productivity and browser-based SaaS applications
  • Multitasking across email, chat, documents, and internal systems
  • Secure access to company data from the office, home, or while traveling

In this environment, consistent responsiveness matters more than peak benchmark performance. Employees notice when laptops hesitate during meetings, struggle under multitasking loads, or require frequent charging throughout the day.

As AI features such as transcription, summarization, and background effects become embedded in everyday tools, they can place additional strain on the CPU and GPU of systems that lack dedicated AI acceleration. That strain affects performance, battery life, and overall user experience.

Organizations are investing in AI to improve productivity. To see measurable gains, employees need laptops designed for sustained local AI workloads—not systems that force the CPU or GPU to handle tasks they weren’t optimized for. That means choosing processors with dedicated AI acceleration built in, so AI features can run efficiently without compromising performance or battery life.

Why Intel® Core Ultra 200V Is a Strong Default for Business Professionals

Intel® Core Ultra 200V processors are built for thin-and-light business laptops that prioritize mobility, battery life, and on-device AI performance. As Intel’s most efficiency-focused family within its broader portfolio of AI-accelerated processors—often referred to as Intel® AI chips—the 200V series is designed for highly mobile professionals and Copilot+ PC–class experiences.

Intel® Core Ultra 200V integrates three key components into a balanced platform:

  • A CPU optimized for everyday productivity and multitasking
  • Integrated Intel® Arc™ graphics for modern business visuals and media workloads
  • A fourth-generation Neural Processing Unit (NPU) designed for sustained, low-power AI processing

For collaboration-heavy professionals, that dedicated NPU makes a practical difference. Instead of forcing AI features to compete with other applications for CPU or GPU resources, supported AI workloads can run more efficiently in the background. The result is a laptop that remains responsive during meetings, multitasking, and travel while preserving battery life across a full workday.

Intel® positions the 200V series as its most efficient x86 processor family to date within its class, supporting up to 120 total platform TOPS across the CPU, GPU, and NPU. The NPU alone meets Microsoft’s 40+ TOPS requirement for Copilot+ PCs, enabling advanced on-device AI features without stepping up to higher-power processor tiers designed for engineers or creators.
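To make the Copilot+ screening logic concrete, here is a minimal Python sketch a fleet team might use. The candidate list and the 48-TOPS sample value are illustrative assumptions; the only figures drawn from this article are Microsoft's 40+ TOPS NPU requirement and the up-to-120 total platform TOPS.

```python
# Illustrative screen for Copilot+ PC eligibility based on NPU TOPS.
# Microsoft's stated requirement is an NPU capable of 40+ TOPS.
COPILOT_PLUS_NPU_TOPS = 40

def meets_copilot_plus_npu(npu_tops: float) -> bool:
    """Return True if the NPU meets the Copilot+ PC 40+ TOPS threshold."""
    return npu_tops >= COPILOT_PLUS_NPU_TOPS

# Hypothetical candidate fleet; the per-device numbers are assumptions
# for illustration, not official specifications.
candidates = {
    "Intel Core Ultra 200V laptop": {"npu_tops": 48, "platform_tops": 120},
    "Legacy CPU-only laptop": {"npu_tops": 0, "platform_tops": 10},
}

eligible = [name for name, spec in candidates.items()
            if meets_copilot_plus_npu(spec["npu_tops"])]
print(eligible)  # ['Intel Core Ultra 200V laptop']
```

The point of the check is the threshold itself: Copilot+ eligibility is gated on NPU TOPS specifically, not on the combined platform figure.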

For organizations refreshing business laptops at scale, this balance is what makes Intel® Core Ultra 200V a strong default. It delivers the performance most knowledge workers actually use, the battery life mobile employees depend on, and the AI capability required for modern Windows experiences without the cost and complexity of over-specifying the fleet.

The Three Biggest Reasons Intel® Core™ Ultra 200V Fits Business Pros

For most business professionals, the value of Intel® Core Ultra 200V shows up in how reliably it handles everyday work.

1. Mobility-first Performance for Everyday Work

Intel® Core Ultra 200V is designed to deliver fast, consistent performance for productivity and collaboration workloads without the higher power draw of performance-oriented processor tiers.

Its efficiency-focused architecture balances performance cores and low-power cores to keep foreground applications responsive while background tasks run smoothly. The result is steady performance throughout the day, even during heavy multitasking.

Because the 200V series is built for thin-and-light systems, it enables the portable, quiet form factors many mobile professionals prefer. And by meeting Microsoft’s Copilot+ PC requirements, it supports advanced on-device AI features that increasingly complement everyday business workflows.

2. Battery Life That Supports Full Workdays

Battery life is a primary consideration in modern business environments, where hybrid work, travel, and back-to-back meetings are common.

The Intel® Core Ultra 200V series is designed with power efficiency as a priority, reporting up to 50% lower package power compared to prior generations—depending on configuration. That efficiency translates into thin-and-light systems capable of supporting 20+ hours of battery life (according to industry benchmark standards), allowing mobile professionals to move through a full workday—including AI-assisted features running in the background—without constantly searching for a charger.

For most business roles, sustained battery life has a greater impact on productivity than incremental gains in peak performance. The ability to stay responsive and untethered throughout the day is what makes an efficiency-focused processor tier the smarter default for mobile-first users.

3. On-device AI Readiness without Overbuying

AI features—such as real-time transcription, background effects, summarization, and translation—are becoming standard in everyday work tools. Increasingly, these capabilities run directly on the device rather than in the cloud, which helps support security and privacy. This shift is driving interest in Intel AI chips designed to handle AI workloads locally, without relying entirely on cloud processing.

To support those features consistently, laptops need dedicated AI acceleration—not just faster CPUs. Intel® Core Ultra 200V includes a built-in NPU designed to handle supported AI workloads locally and efficiently. This type of local processing can help reduce data exposure for routine tasks such as meeting transcription and summarization.

By offloading these tasks from the CPU and GPU, the system can maintain responsiveness and battery life even as AI features expand within Windows and business applications.

IDC research highlights the growing reliance on on-device AI processing to improve responsiveness and user experience. Intel® Core Ultra 200V aligns with this shift, enabling organizations to support emerging AI use cases without automatically moving users into higher-power processor tiers designed for specialized technical roles.

For many business professionals, that balance is what matters most: modern AI capability, consistent performance, and practical fleet standardization without overbuying hardware built for engineers or creators.

What Intel® Core Ultra 200V Is Best for and When to Choose Something Else

Intel® Core Ultra 200V is well-suited for business users whose day revolves around collaboration, productivity applications, and mobility.

Best-fit roles include:

  • Executives and managers
  • Sales professionals and consultants
  • General knowledge workers
  • Collaboration-heavy teams working primarily in meetings, documents, and browser-based tools

For these users, the priority is consistent responsiveness, strong battery life, and support for modern on-device AI features—not sustained workstation-class performance.

When to consider H or HX instead:

  • Engineers running CAD, simulation, or modeling software
  • Data science and analytics roles requiring sustained CPU or GPU performance
  • Advanced creative workflows, such as 3D rendering or heavy video production

For these specialized roles, higher-power processor tiers may be appropriate. Intel® Core Ultra H and HX are both built for heavier workloads, but the right fit depends on whether you’re prioritizing stronger graphics performance or maximum CPU headroom. For most other business professionals, Intel® Core Ultra 200V provides the right balance of efficiency, AI capability, and mobility.

Processor Tier | Typical Use Case
Intel® Core Ultra 200V | Mobile-first knowledge workers and collaboration-heavy roles
Intel® Core Ultra U | Standard productivity users and cost-controlled deployments
Intel® Core Ultra H | Engineering, technical, and creative specialists needing higher sustained performance and strong integrated graphics
Intel® Core Ultra HX | CPU-intensive engineering and analytics roles; often paired with discrete graphics, but configuration varies by system

Building a Smart Intel® Core Ultra 200V Business Configuration

The processor is only one part of a successful laptop refresh. Memory, storage, connectivity, and security configuration all influence performance, longevity, and user satisfaction.

A practical Intel® Core Ultra 200V business standard typically includes:

  • Memory: 16GB as a baseline, with 32GB for heavier multitasking or longer refresh cycles
  • Storage: 512GB to support modern applications, local data, and AI-enabled workflows
  • Connectivity: Wi-Fi 6E or Wi-Fi 7 and Thunderbolt™ for consistent, high-bandwidth docking and peripheral support
  • Security and manageability: TPM, secure boot, and Intel® vPro®
  • Form factor: Thin-and-light systems for mobile teams; larger designs for users with heavier workloads

In many cases, these specifications align with how current business laptops are already configured, making it easier to standardize across the fleet while avoiding unnecessary over-specification.
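A fleet standard like the one above can also be operationalized as a simple spec check during procurement. The sketch below is a minimal illustration; the field names and the sample device are hypothetical, and the thresholds mirror the baseline listed in this article.

```python
# Minimal sketch: validate a candidate laptop spec against the baseline
# business standard described above. Field names and the sample device
# are illustrative assumptions, not a vendor schema.
BASELINE = {
    "memory_gb": 16,                    # 32GB for heavier multitasking
    "storage_gb": 512,
    "wifi": {"Wi-Fi 6E", "Wi-Fi 7"},
    "security": {"TPM", "Secure Boot"},
}

def meets_baseline(spec: dict) -> list:
    """Return a list of gaps; an empty list means the spec meets the standard."""
    gaps = []
    if spec.get("memory_gb", 0) < BASELINE["memory_gb"]:
        gaps.append("memory below 16GB baseline")
    if spec.get("storage_gb", 0) < BASELINE["storage_gb"]:
        gaps.append("storage below 512GB baseline")
    if spec.get("wifi") not in BASELINE["wifi"]:
        gaps.append("needs Wi-Fi 6E or Wi-Fi 7")
    missing = BASELINE["security"] - set(spec.get("security", []))
    if missing:
        gaps.append("missing security features: " + ", ".join(sorted(missing)))
    return gaps

candidate = {"memory_gb": 16, "storage_gb": 512,
             "wifi": "Wi-Fi 7", "security": ["TPM", "Secure Boot"]}
print(meets_baseline(candidate))  # []
```

Encoding the standard once, rather than re-deciding it per purchase, is what makes fleet-wide consistency practical.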

Standardize Smarter with Intel® Core Ultra 200V

For organizations refreshing laptops at scale, standardization reduces cost, complexity, and operational risk. Choosing the right processor tier is one of the most important decisions in that process.

Intel® Core Ultra 200V represents a practical default for modern business professionals. It supports collaboration-heavy workloads and on-device AI features without moving users into higher-power processor tiers designed for specialized technical roles. The result is a fleet standard that balances mobility, battery life, responsiveness, and AI readiness.

For many organizations, that balance is what makes Intel® Core Ultra 200V the smart starting point for an AI PC refresh.

Take the Next Step

If you’re finalizing your laptop refresh standard or evaluating your AI PC strategy:

Download the Intel® Processor Comparison Guide to evaluate processor tiers side by side.

View laptops powered by Intel® Core Ultra processors.

To discuss how Intel® Core Ultra 200V fits into your broader AI PC refresh strategy, talk to a Connection specialist.

Trust Falls and Agentic Calls
Mar 23, 2026 | Rai Basharat
https://community.connection.com/trust-falls-and-agentic-calls/

What Healthcare Leaders Told Us about AI Trust at HIMSS 2026

I opened the room with a simple ask: “Raise your hand if you would trust an AI agent to schedule your own mother's surgery.” Most hands went up. No surprise there. Then I followed with, “Now keep it raised if you’d trust it to approve her prior auth.”

Here is what I did not expect. Contrary to my own preconceived notions, the hands mostly stayed up. A few people never raised their hand in the first place, which was honest and worth noting. But the room did not give me the dramatic drop I had anticipated. These healthcare leaders were more comfortable with autonomous AI than I assumed they would be, and that told me something important: the industry has moved faster in its thinking than many of us advisors have given it credit for.

This was our focus group at HIMSS 2026 in Las Vegas: “Trust Falls and Agentic Calls: Healthcare’s Next Leap.” Fifty-five minutes with health system leaders, clinical informaticists, payer-side operators, and health IT executives. We talked about where agents belong, where they don’t, who is liable when they get it wrong, and what patients actually expect. We also ran a live survey of 21 participants. What came back was honest, complicated, and occasionally contradictory—which is to say, it sounded like real people thinking through a hard problem.

[Survey result: “In one or two words, what is the absolute key to mainstream adoption of agentic AI in healthcare?” 5 of 21 respondents (24%) answered Trust.]

And the timing could not be more relevant. Healthcare AI is no longer experimental. According to Menlo Ventures, 22 percent of healthcare organizations have now implemented domain-specific AI tools, a tenfold increase over 2023. Health systems are leading adoption at 27 percent, followed by outpatient providers at 18 percent and payers at 14 percent. The money is moving too: ambient clinical documentation alone generated $600 million in revenue in 2025, up 2.4 times year over year, and coding and billing automation added another $450 million. The agentic AI in healthcare market is projected to grow from $1.8 billion in 2026 to nearly $20 billion by 2034. This is not a pilot conversation anymore. This is an industry that is spending real money and discovering, in real time, that the governance has not kept up.

The Room Is Optimistic. Cautiously.

Fifty-seven percent of our survey respondents described themselves as “cautiously optimistic but needing more proof.” Thirty-eight percent said they were “excited and ready for implementation.” One person, five percent, was highly skeptical. Nobody said they were completely opposed. That distribution tracks with what I hear in client engagements every week: people want this to work—they just don’t want to be the ones it fails on first.

And the proof they want is operational, not theoretical. They are not waiting for another white paper. They want to see a prior auth agent actually reduce denials at a health system that looks like theirs. They want to see a call center triage bot handle 10,000 calls a week without a compliance incident. Consider the math behind that urgency: an AMA survey from 2025 found that clinicians complete roughly 39 prior authorizations per week and spend about 13 hours on the process, with most reporting that it contributes directly to burnout. McKinsey estimates that AI-enabled revenue cycle management could deliver a 30 to 60 percent reduction in cost to collect. Health systems collectively spend more than $140 billion annually on revenue cycle operations, and the CAQH Index pegs the savings opportunity from automating routine transactions like eligibility, claims, and prior auth at $20 billion. The ROI case is not theoretical. It is staring at the ceiling of every CFO’s office.
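The AMA figures above translate into roughly 20 minutes of clinician time per prior authorization. This small back-of-envelope calculation uses only the numbers cited here, except for the 60% automation rate, which is a hypothetical assumption for illustration, not a survey finding.

```python
# Back-of-envelope math on the AMA prior-auth figures cited above:
# ~39 prior authorizations and ~13 hours per clinician per week.
auths_per_week = 39
hours_per_week = 13

minutes_per_auth = hours_per_week * 60 / auths_per_week
print(f"{minutes_per_auth:.0f} minutes per prior auth")  # 20 minutes

# Hypothetical assumption: an agent fully handles 60% of routine auths.
automation_rate = 0.60
hours_saved = hours_per_week * automation_rate
print(f"{hours_saved:.1f} clinician-hours saved per week")  # 7.8
```

Nearly a full workday of clinician time recovered per week, per clinician, is the kind of operational proof this room said it was waiting for.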

Hallucinations Keep Everyone Up at Night

When we asked about the single biggest barrier to trust, 57 percent said hallucinations and clinical inaccuracy. That number did not surprise me. What did surprise me was how far ahead it was. Loss of the human-to-human empathy connection came in at 19 percent. Data privacy and security at 14 percent. Hidden algorithmic bias at 10 percent.

I expected privacy to rank higher, honestly. HIPAA has been the dominant conversation in healthcare IT for two decades. But this group told us something different: they are less worried about data leaking out and more worried about bad reasoning going in. A hallucinated care pathway, a confidently wrong claim code, an agent that auto-approves something it should have flagged. That kind of failure is hard to catch because it looks like competence. And it is not just a theoretical risk. Payers are now deploying AI systems that can review and deny claims in seconds, processing denials at a scale and speed that manual provider workflows cannot match. The percentage of providers reporting denial rates above 10 percent surged from 30 percent in 2022 to 41 percent in 2025. When both sides are running agents, the accuracy question becomes an arms race.

We also asked an open-ended question: “In one or two words, what is the absolute key to mainstream adoption of agentic AI in healthcare?” The answers clustered hard. Accuracy. Trust. Transparency. Repeatedly. One respondent wrote, “Cybersecurity, i.e., trust. Are we meeting FDA, NIST/FedRAMP, or IEEE UL 2933 standards?” Another wrote simply, “Oversight guardrails.” A third said, “Fail-safe options for human in the loop with simple communication.” These are people who think in systems, not slogans.

71 Percent Would Look Under the Hood

Here is the finding I keep coming back to. We asked, “If an AI agent recommends a care pathway that contradicts your initial clinical judgment, what is your most likely response?” Seventy-one percent said they would dive into the AI’s logic and citations to see what they might have missed. Nineteen percent would consult a human colleague for a tie-breaker. Only 10 percent would reject the AI’s suggestion outright.

Think about that for a second. Seven out of ten clinicians and clinical leaders in this room said their first instinct, when an AI disagrees with them, is to check whether the AI might be right. That is not the response of people who fear the technology. That is the response of professionals trained to follow evidence wherever it leads. But it puts enormous pressure on explainability. If most of your clinical workforce is going to open the hood when the AI challenges them, the engine underneath had better make sense.

Which brings us to what explainability actually means to these people. Forty-eight percent said the most important feature is a plain-English summary of the system’s decision-making steps. Twenty-four percent wanted links to peer-reviewed literature supporting the AI’s action. Another 24 percent wanted a confidence score displayed in a clear UI. Five percent prioritized an instant “escalate to human” button. Nobody wants a black box. They want reasoning they can read, evaluate, and explain to the patient standing in front of them. That aligns with the Joint Commission and CHAI’s Responsible Use of AI in Healthcare guidance, released in September 2025, which calls on health systems to build formal governance structures with mechanisms for disclosing AI use and educating both staff and patients. The Joint Commission is developing a voluntary AI certification program for its network of 22,000 accredited healthcare organizations. The industry is formalizing what our focus group already knew intuitively: explainability is not a nice-to-have. It is a clinical requirement.

Patients Are Already Ahead of Us

Sixty-seven percent of respondents agreed that within five years, patients will prefer the speed of an autonomous AI for minor urgent care over waiting for a human. One-third disagreed. But even the skeptics acknowledged the pressure is real. Patients have already made this choice in banking, tax prep, and grocery delivery. Healthcare will not stay the exception forever.

Seventy-one percent of our respondents said patients must be told whenever AI handles their logistics. Full transparency, every time. The remaining 29 percent preferred a conditional approach: disclose only when the AI impacts clinical care directly. Nobody chose not to disclose at all. Zero. That is worth sitting with for a moment.

But disclosure alone is not enough. A national CHAI survey of 1,456 patients, conducted by NORC at the University of Chicago, found that 93 percent of patients reported at least one concern about the use of AI in healthcare and 51 percent said AI actually makes them trust healthcare less. However, more than 80 percent said that trust would increase if clear accountability measures were in place. The data is telling us that transparency without accountability feels performative. Patients do not just want to be told AI is involved. They want to know who is responsible when it gets something wrong.

What I found even more interesting was the generational tension underneath these numbers. In the room discussion, several providers admitted they feel less compelled to explain AI’s role to older patients who don’t ask about it. Meanwhile, Gen Z patients are walking into appointments having already consulted three AI tools, compared treatment options on a symptom checker, and read a Reddit thread about their diagnosis. They are not passively receiving care. They are researching before the provider enters the room. One participant put it bluntly: “I have to be more prepared now because the patient already is.” The training norms inside organizations have to catch up with this. Staff need to know how to explain AI’s role, how to override a recommendation when their judgment says otherwise, and how to document the handoff between human and machine. Most organizations have not built this into onboarding or continuing education yet.

Nobody Knows Who Is Liable, and That Is a Problem

We asked who should bear primary legal liability if an autonomous clinical AI makes a harmful error. Sixty-seven percent said we need an entirely new model of shared liability. Fourteen percent pointed to the software vendor. Fourteen percent said the attending physician. Five percent said the health system. The current frameworks were built for a world where a human being made every clinical decision. When an AI agent auto-codes a charge and triggers an audit, or initiates a prior auth that delays care, the liability question gets murky in a way nobody has resolved yet.

The industry is basically saying: we know this is coming, we know the rules don’t fit, and we need new ones before something goes wrong. Until those rules exist, every agentic deployment carries legal risk that governance has to address explicitly. Oklahoma’s HB1915 proposed a comprehensive framework requiring governance bodies for AI oversight and performance evaluations tied to patient outcomes. Similar bills are expected across multiple states in 2026. Meanwhile, the Joint Commission’s voluntary AI certification will likely become a de facto standard. The regulatory patchwork is forming fast, and organizations without a governance framework are going to find themselves scrambling.

This is also why billing models are shifting. The smarter vendors are moving toward outcome-based pricing. If an agent improves your clean claim rate by 15 percent, the vendor earns based on that result. If it doesn’t deliver, the cost adjusts. That kind of pricing is not just commercially appealing. It is a governance mechanism. It forces vendors to own the performance of their agents instead of shipping a product and collecting a license fee. When your vendor’s revenue depends on the agent working correctly, you have a fundamentally different accountability relationship than when they just sell you seats.

Where Agents Work and Where They Don’t (Yet)

The focus group was clear: start with admin, not clinical. Scheduling, patient access call centers, claims status, revenue cycle workflows. These are high-volume, repeatable processes with measurable outcomes and clear escalation paths. One participant’s patient access center handles 10,000 calls a week, and an AI agent could triage 60 percent of them. Others are already piloting transfer-of-care coordination agents that save hours of nursing time daily.

The vendor ecosystem is moving in the same direction. Waystar announced in January 2026 that it is building an end-to-end autonomous revenue cycle using an agentic network. Epic now connects more than 1,000 hospitals and 22,000 clinics to TEFCA via Epic Nexus. Oracle’s Health Clinical AI Agent reported a roughly 30 percent reduction in daily documentation time across more than 30 specialties. UiPath launched agentic AI for prior authorization and claim denial management at ViVE 2026. Startups like VoiceCare AI are piloting agents at Mayo Clinic that make outbound calls to payers for benefit verification, sitting on hold for up to two and a half hours so staff don’t have to. McKinsey reports that in 2025, more than 30 percent of providers prioritized AI implementation for seven specific revenue cycle use cases, up from four or five in 2023 and 2024. The market is not waiting around.

Clinical decision-making is a different conversation, and the room was unanimous about it. A licensed operator has to stay in the loop for anything touching diagnosis, treatment planning, or medication management. That is not a knock on the technology. It is a recognition that governance maturity has to catch up before autonomy expands. We asked the room, “If an agent improved your clean claim rate by 15 percent, but you couldn’t fully explain how it coded a charge, would your compliance officer sign off?” The silence was its own answer.

Shadow AI Is Already Here

While leadership debates governance in the boardroom, the frontline has already decided. A Wolters Kluwer survey of 518 healthcare workers in December 2025 found that 40 percent had encountered unauthorized AI tools in their workplace and nearly 20 percent admitted to using them. Half said they did it for a faster workflow. A third said their organization lacked approved tools with the functionality they needed. One in ten said they had used an unauthorized AI tool for a direct patient care use case.

This is not a failure of people. It is a signal of unmet need. And the risk is real: IBM’s 2025 Cost of a Data Breach report ranked healthcare as the costliest industry for breaches for the fourteenth consecutive year, with the average breach costing $7.4 million. Twenty percent of surveyed organizations suffered a breach due to shadow AI specifically. The path forward is not to ban personal AI use. It is to provide governed alternatives that are fast enough and good enough that people stop reaching for the unauthorized ones.

When It Goes Wrong, How You Respond Matters More than How You Prevented It

We asked what the best response is after a trust-breaking event. Forty-three percent said a transparent post-mortem shared with all staff. Twenty-nine percent wanted to retrain the model with staff input. Nineteen percent said mandate a human in the loop for that workflow permanently. Only 10 percent said shut the system down.

I was struck by how measured these responses were. These are not people who would panic at the first error. They understand that AI systems will get things wrong, the same way human systems do. What matters is whether the organization responds with transparency or silence. The 43 percent who want post-mortems are describing a culture that learns from failure. The 29 percent who want staff involved in retraining are saying something I hear constantly in my work: we don’t trust a model we can’t shape.

The Real Barrier Is Not the Technology

I asked the room, “If I asked you to show me the documented process for how a claim moves from submission to collection, could you?” Most people laughed, which was the answer. You cannot automate a workflow that does not exist on paper. An agent cannot follow a care pathway nobody has mapped. The number one barrier to agentic AI in healthcare is not model accuracy or regulatory ambiguity. It is undocumented processes. And with 70 percent of healthcare leaders reporting early to mid-stage AI maturity, according to a recent industry assessment, the gap between ambition and readiness is still wide.

I also asked whether they were building an AI strategy or buying an AI platform, because those are two very different things and only one survives a vendor pivot. Your EHR vendor is going to ship AI features whether you are ready or not. Epic, Oracle, and a growing roster of startups are embedding agents into workflows right now. The organizations that come out ahead will not be the ones running the most agents. They will be the ones that can tell you exactly what each agent does, who is accountable when it fails, and what happens next.

One participant said something at the end that stuck with me. I had asked everyone to name one thing they would do differently about AI governance in the next 90 days. She said, “I’m going to stop treating governance like a project and start treating it like a practice.” I don’t think I can say it better than that.

Ready to build an AI governance practice, not just a plan?

Talk to CNXN Helix: cnxnhelix.com

Microsoft 365 E7 – The Frontier Worker Suite: The Intersection of Security, Compliance, and AI
Mar 19, 2026 | Casey Lindsay
https://community.connection.com/microsoft-365-e7-the-frontier-worker-suite-the-intersection-of-security-compliance-and-ai/

Ever since the dawn of Copilot (circa early 2023), our customers have been asking about the possibility of Microsoft bundling Copilot with M365 E5. It was merely conjecture during those conversations, but I had a hunch that the day would come when we’d see E5 and Copilot in holy matrimony, riding off into the sunset, a match made in heaven! Let’s first be clear that this new suite is called Microsoft 365 E7, not E5 + Copilot. And to be even more exact, “The Frontier Worker Suite.” And the fun does not stop there as far as what is included in this shiny new toy. We’ll get into the nuts and bolts of what’s included later, talk about pricing, and, finally, cover who I believe presents a proper use case for E7. But before we unpack all these goodies, I want to focus on why E7—and why now—with general availability recently announced for May 1, 2026.

The Foundation

I want to be clear that this is my own personal speculation, but I’ll call it substantiated speculation as it comes with 21 years of experience working very closely with Microsoft—including my current role as Principal Consultant and Lead Advisor in our Microsoft Licensing Optimization (MLO) practice. With the advent of AI and the increasing popularity and adoption of Copilot, bundling Copilot with E5 makes logical sense.

To properly deploy Copilot though, it is critical to address your security posture to prevent data breaches and ensure compliance. These measures include enhancing identity and access controls (think Zero Trust Architecture principles), establishing a mature data governance foundation (e.g. unified labeling, DLP, and audit capabilities across M365), and continuous monitoring to maintain control and meet regulatory requirements.

Excuse the plug, but here at Connection, we have workshops specifically designed to do just that: shore up all security gaps in advance of a Copilot deployment. So essentially, you can’t have one without the other—and chronologically, E5 should come before Copilot. I liken it to pouring the foundation before you build your house.

Why Now?

Again, personal speculation here, but I believe the timing of this E7 release is highly strategic and the culmination of several events all happening at once, compounding each other. First, Microsoft has done a fantastic job in seeding the market with Copilot. As an example, I see Copilot subscriptions in the portfolio mix of almost every one of my customers. But they all are at different stages of this AI journey. Let’s also not forget this year’s Super Bowl ad featuring NFL recruiters leveraging Copilot to scout players and make data-driven decisions. Copilot branding has become commonplace.

Second, we’re about to turn the calendar to the final quarter of Microsoft’s fiscal year, and the general availability date of May 1 is clearly not a coincidence. Third, Microsoft has already announced price increases to many key Enterprise Suites, including M365 E5, which is increasing by 5.3% starting July 1, 2026. And lastly, there’s Microsoft’s year-to-date stock performance.

I believe that the culmination of these four events led to this E7 release ahead of what would historically have been a key announcement in July to kick off Microsoft’s new fiscal 2027. I also believe it was advantageous for Microsoft to make this announcement now—and just as advantageous for their customers to build E7 into their portfolios. What better timing than Microsoft’s Q4, when Microsoft has historically been more willing to negotiate?

E5 vs. E7

Let’s dig into the incremental differences between E5 + Copilot and M365 E7 and discuss some pricing logic. E7 layers two additions on top of E5 + Copilot: Entra Suite and Agent 365. Entra Suite is essentially Microsoft’s all-in-one Zero Trust access solution that brings together identity security, access governance, and network access controls under one cloud-based bundle. It is designed to give organizations least-privilege access everywhere (e.g., cloud apps, on-prem apps, and Web traffic) based on identity, device, and real-time risk. You can securely control who can access which apps, from where, and under what conditions, without relying on legacy VPNs or disconnected tools. It is absolutely a step above what Entra Plan 1 and Entra Plan 2 can deliver.

With the increased adoption of AI across organizations, and the need to govern AI agents—much like human workers with a pulse—Microsoft’s Agent 365 creates an enterprise control plane for these AI agents. This allows organizations to see, control, secure, and manage these AI agents at scale across the M365 estate, Azure, and even third-party platforms. Agent 365 treats AI agents like digital employees, giving IT and SecOps one place to manage their identity, access, behavior, and lifecycle. Both Entra Suite and Agent 365 fit hand-in-glove when enhancing an organization’s security posture—especially in this age of rapid AI progression.

Pricing and Value

M365 E7 Suite is slated to cost $99 per user, per month. M365 E5 on an Enterprise Agreement is $57 per user, per month until July 1, when Microsoft will raise it to $60. Copilot is another $30 per user, per month. We’re already $90 into that $99 price point. Entra Suite à la carte is $12 per user, per month. Microsoft Agent 365 will be $15 per user, per month, bringing us to a grand total of $117 per user, per month for the sum of the parts. The way this math shakes out, with M365 E7 you’re essentially getting Microsoft Agent 365 at no cost, plus change to put right back into your pocket!
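
A quick sanity check of that arithmetic, using the post-July 1 E5 price (all figures per user, per month, as quoted above):

```python
# Per-user, per-month list prices quoted above (post-July 1, 2026 E5 price).
a_la_carte = {
    "M365 E5": 60,
    "Copilot": 30,
    "Entra Suite": 12,
    "Agent 365": 15,
}
e7_bundle = 99

sum_of_parts = sum(a_la_carte.values())     # 117
monthly_savings = sum_of_parts - e7_bundle  # 18 per user, per month

print(f"Sum of parts: ${sum_of_parts}/user/month")
print(f"E7 bundle:    ${e7_bundle}/user/month")
print(f"Savings:      ${monthly_savings}/user/month "
      f"(Agent 365 at ${a_la_carte['Agent 365']} effectively free, "
      f"plus ${monthly_savings - a_la_carte['Agent 365']} back)")
```

At, say, 1,000 seats, that $18-per-user monthly delta works out to $216,000 a year before any Q4 negotiation.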

Final Thoughts

If you’re already invested in M365 E5 and Copilot today, E7 is worthy of serious evaluation. Additionally, if you have concerns about how to manage and govern AI agents, Agent 365, which launches à la carte alongside M365 E7 on May 1, could be a great fit.

At minimum, I would highly recommend getting prepared: understand the components of this suite and its pricing, and determine your own internal appetite for making this investment. M365 E7 hits Microsoft pricelists in April, and that means Microsoft will be ready to talk to YOU!

Within Connection, we have a deep bench of Microsoft experts prepared to support our customers on this new frontier, E7. We meet our customers exactly where they are on this journey. As the nature of modern work continues to evolve, we’re prepared to be there to advise and support them every step of the way!

Manufacturing Language Models: Turning Your... https://community.connection.com/manufacturing-language-models-turning-your-factory-data-into-ai-currency/ Mar 16, 2026 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2026/03/3578571-Manufacturing-Language-Model-Blog-BLOG.png

Your manufacturing data isn’t just sitting in databases—it’s currency. The question is whether you’re spending it or letting it collect dust. Every sensor reading, quality check, maintenance log, and production run represents potential value that most manufacturers leave untapped. Manufacturers that effectively leverage their data can reduce machine downtime by up to 50% and increase production efficiency by 10–20%. Yet the challenge isn’t collecting data—it’s making sense of the massive volume, format, and siloed information flowing through your operations every day.

Why We Need a Manufacturing Language Model (MLM)

U.S. manufacturers face a perfect storm of challenges, making 2026 a pivotal year. The skilled labor shortage continues to intensify, with the National Association of Manufacturers reporting that the industry could face a shortage of 2.1 million workers by 2030. Meanwhile, input costs remain volatile; tariffs, inflation, and technical debt from legacy systems accumulate; and the gap between operational technology (OT) and information technology (IT) creates data silos that prevent holistic decision-making.

The real problem isn’t any single challenge—it’s that your manufacturing data is fragmented. Your MES system knows production rates. Your ERP knows costs and inventory. Your quality management system tracks defects. Your maintenance logs predict failures. Your CRM and customer support platforms tell you everything you need to know about what your customers value, their challenges, and even issues with your product. But these systems don’t talk to each other, and the people who need insights are buried in manual data wrangling rather than solving problems or leveraging their domain-specific expertise to act. Operations leaders spend hours extracting data from multiple systems, while R&D teams can’t quickly correlate product design decisions with manufacturing outcomes. The manufacturing engineer who could innovate and act is instead drowning in spreadsheets.

This matters because your competitors are figuring this out. Manufacturing is becoming an information industry, and the companies that can contextualize their complex, multi-modal data fastest will win on cost, quality, and speed. This will become especially important as we transition from our current AI state to the future state of physical AI.

The Evolution: From Large Language Models to Something More Powerful

The AI landscape has evolved rapidly. Large Language Models (LLMs) like GPT-4 democratized access to artificial intelligence, showing us that models trained on massive text datasets could understand context and generate human-like responses. Then came Small Language Models (SLMs)—more efficient, focused models that could run on edge devices with lower computational costs.

Domain-specific language models pushed this further, training AI on specialized knowledge in fields like medicine, law, or specific engineering disciplines. Multi-modal models broke down barriers between data types, processing text, images, video, and structured data simultaneously. Now the trend is toward mixture-of-experts architectures that combine multiple specialized models, each excelling at different tasks, to deliver more comprehensive insights than any single model could achieve.

What makes this evolution significant for manufacturing is the recognition that the most powerful AI applications don’t come from general-purpose models—they come from models that deeply understand your unique corporate context, combine diverse data types, and connect domain knowledge with real-time operations.

Introducing the Manufacturing Language Model

This brings us to what our team refers to as the Manufacturing Language Model (MLM)—a specialized AI framework that combines domain-specific manufacturing knowledge, company-specific operational context, and multi-modal data integration to transform how manufacturers access, understand, and act on their information.

An MLM isn’t just another analytics dashboard. It’s an intelligent layer that sits across your entire manufacturing ecosystem—understanding time-series sensor data from your factory floor, contextualizing it with quality metrics, correlating it with supply chain information, and connecting it to industry best practices and your company’s institutional knowledge.

It speaks the language of manufacturing. For example, it understands what “cycle time” means in your specific context, recognizes the relationship between ambient humidity and coating defects in your process, and notes that when Line Three’s Temperature Sensor 7 drifts, it precedes bearing failure by 48 hours.

The transformative value comes from four capabilities working together:

  1. MLMs combine complex, multi-modal data—sensor readings, images, maintenance logs, operator notes, design specifications, and quality reports—without requiring everything to be structured the same way.
  2. MLMs connect data across your entire ecosystem. An automotive supplier can correlate upstream raw material variations with downstream warranty claims. A pharmaceutical manufacturer can trace batch genealogy while simultaneously analyzing environmental conditions and operator certifications. 
  3. MLMs layer industry knowledge, company-specific processes, and role-specific context onto your data. They don’t just show a quality engineer that reject rates increased—they contextualize it with similar historical patterns, suggest root causes based on process knowledge, and recommend investigations based on what worked before.
  4. Last but most importantly, MLMs empower your people to operate in their value-add roles. Your operations manager doesn’t become a data analyst—they ask questions in natural language and get answers that let them deploy their creativity, intuition, and experience. “Why is Line Two underperforming this week?” gets a contextualized answer that considers changeovers, material variations, maintenance history, and operator scheduling—all without a single SQL query.

The Path Forward: Building Your MLM Strategy

Every manufacturing organization we engage at CNXN Helix shares a common aspiration: to establish data pipelines that break down silos, implement a data fabric that contextualizes information across systems, and deploy intelligence that empowers rather than replaces their people. The MLM concept represents the convergence of these goals.

Building an effective MLM requires three integrated efforts: developing a clear data strategy that prioritizes contextualization over mere aggregation, selecting and deploying the infrastructure to support real-time, multi-modal data integration, and implementing AI models specifically tuned to your manufacturing domain and company context.

This isn’t about replacing your organization’s knowledge—it’s about scaling it. CNXN Helix helps manufacturers create market differentiation by making their data accessible and actionable, shortening business lifecycles by accelerating decision-making from days to minutes, and leveraging AI as an unfair advantage by building competitive advantages that are deeply rooted in your specific operations and impossible for competitors to replicate quickly.

Your manufacturing data is currency. The question is, are you ready to invest in it?

CNXN Helix Center for Applied AI and Robotics partners with U.S. manufacturers to develop AI strategies that transform operational data into competitive advantage. Contact us to explore how manufacturing language models, vision AI, and other AI domain disciplines can revolutionize your operations.

Manufacturing’s Next Advantage: AI ready... https://community.connection.com/manufacturings-next-advantage-ai-ready-it-modernization-on-azure/ Mar 10, 2026 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2026/03/3561072-GTM-MIM-Manufacturing-Azure-BLOG.jpg

The U.S. manufacturing sector is at an inflection point—lag behind overseas competition and technology, or modernize to offset industry headwinds and thrive. Pressure to adopt artificial intelligence, integrate operational technology (OT) data with enterprise IT systems, and modernize legacy infrastructure is mounting—all while plant leaders face a deepening workforce shortage, relentless cost pressures, and global competition. For IT leaders, CTOs, and Operational Leaders, the question is no longer whether to modernize, but how to do it strategically without disrupting the production lines that keep revenue flowing.

The stakes are staggering. According to a study by Deloitte and The Manufacturing Institute, the U.S. manufacturing skills gap could leave 2.1 million jobs unfilled by 2030, at a potential cost of $1 trillion that year alone. Meanwhile, the Flexera 2025 State of the Cloud Report reveals that 27% of cloud spending is wasted across organizations, and 84% cite managing cloud spend as their top challenge. For manufacturers running hybrid environments with both plant-floor OT systems and enterprise cloud workloads, the complexity—and the opportunity—is immense.

The IT/OT Convergence Imperative

The hype phase of IT/OT convergence is well behind us. Today’s manufacturing environments generate massive volumes of data from sensors, PLCs, SCADA systems, and IoT-connected equipment. Yet much of this data sits in silos, disconnected from the enterprise IT systems that could turn it into actionable intelligence. And with real AI-based solutions now tackling the challenge of contextualizing and bringing disparate data sources together for real value, convergence is mission-critical.

According to BCG, adoption of converged IT/OT technologies for new manufacturing projects is expected to accelerate from roughly 10% today to approximately 50% within the next five years. Manufacturers integrating operational data with cloud analytics can unlock predictive maintenance that lowers maintenance costs by 25–40% and cuts unexpected downtime by up to 30%. But convergence also expands the attack surface—converged IT/OT systems were targeted in 75% of cyber incidents impacting manufacturing firms in the past year.

This is precisely where a well-architected Azure hybrid cloud strategy—anchored by a unified data platform—becomes essential. The challenge most manufacturers face is not a lack of data; it is that data from OT systems (historians, SCADA, MES, IoT sensors) and IT systems (ERP, CRM, supply chain management) lives in entirely separate ecosystems with different protocols, formats, and refresh cycles. Without a common data layer, plant-floor insights never reach the decision-makers who need them, and enterprise priorities never inform operational execution.

Microsoft Fabric addresses this gap directly. As an end-to-end analytics and data platform, Fabric allows manufacturers to ingest, unify, and govern data from both OT and IT sources—landing it in a common data fabric that serves as a single source of truth across the enterprise. Plant-floor sensor data, production batch records, quality inspection results, ERP transactions, and supply chain signals can all be brought together in one governed environment.

Once this data fabric is in place, the strategic value compounds across three layers:

  • Business Intelligence and Reporting: Unified data enables real-time dashboards and operational reporting that span the plant floor to the executive suite. Operations leaders gain visibility into OEE (Overall Equipment Effectiveness), yield rates, energy consumption, and supply chain status—without waiting for manual data reconciliation between IT and OT teams.
  • AI Model Training and Language Models: The same governed data fabric becomes the foundation for AI applications. Manufacturers can train machine learning models for predictive maintenance, quality forecasting, and demand planning using clean, contextualized data that combines OT telemetry with IT business context. Large language models (LLMs) can be fine-tuned on internal process documentation, maintenance logs, and engineering specifications to deliver domain-specific answers to operators and engineers.
  • Agentic AI Operations: Looking forward, the data fabric positions manufacturers to deploy agentic AI—autonomous agents that can monitor production lines, identify anomalies, trigger maintenance work orders, adjust scheduling, and even communicate with suppliers. These agents require broad, trustworthy access to both operational and enterprise data to act autonomously, which is exactly what a unified fabric provides.

Alongside Fabric, Azure Local gives manufacturers consistent management and governance across public cloud and on-premises environments—including edge locations on the plant floor—so OT data can flow securely into the fabric without leaving the network perimeter when it shouldn’t. Microsoft Defender for IoT provides asset discovery, vulnerability management, and threat protection purpose-built for industrial control systems, protecting the expanded attack surface that comes with IT/OT convergence.

AI Readiness: The Manufacturing Opportunity

AI adoption in manufacturing is accelerating rapidly. A 2025 Rootstock survey found that 77% of manufacturers adopted AI in 2024, up from 70% in 2023. Manufacturers applying machine learning are 3x more likely to improve their key performance indicators, according to McKinsey. The applications are tangible: predictive quality control with 90% defect detection accuracy, AI-driven energy management delivering average energy savings of 12%, and production facilities reporting a 78% waste reduction rate.

But getting AI-ready requires the right infrastructure foundation. AI workloads demand enormous compute, storage, and data integration capabilities. Connection’s Azure Cloud Services help manufacturers build this foundation through structured engagements—starting with Cloud Strategy and Envisioning Workshops to align IT and operations stakeholders on objectives, followed by the Azure Well-Architected Framework Review to evaluate architecture against security, cost, reliability, performance, and operational excellence pillars. An Azure Landing Zone provides the pre-built, governed environment manufacturers need to migrate workloads efficiently, while Microsoft Fabric—as described above—serves as the analytics and AI backbone, enabling global access to pre-built AI models and advanced analytics critical for turning plant-floor data into actionable intelligence.

Bridging the Workforce Gap with Smarter Infrastructure

The workforce challenge compounds every other pressure that manufacturers face. The World Manufacturing Foundation reports that 74% of companies face an acute shortage of skilled workers. The World Economic Forum’s 2025 Jobs Report projects that roughly 40% of core manufacturing skills will change in the next three to five years, and more than 54% of incumbent workers will need additional training by 2030. IDC predicts 90% of organizations will be impacted by the IT skills shortage by 2026, costing $5.5 trillion in product delays, quality issues, and revenue loss.

This is where a managed services partner delivers outsized value. Connection, as a certified Azure Expert MSP—a distinction held by fewer than 1% of Microsoft partners worldwide—augments lean IT and operations teams with deep cloud expertise, eliminating the need to recruit scarce talent in a historically tight market. Through Azure Managed Services, Connection provides proactive monitoring, managed detection and incident response, compliance reporting, and FinOps-driven cost optimization. Teams can shift from reactive firefighting—the New Relic Observability Forecast 2025 reports that 30% of IT operations time is consumed by emergencies and interruptions—to strategic work that supports plant productivity and innovation.

Turning Complexity into Competitive Advantage

The convergence of hybrid cloud, AI, OT integration, and workforce pressures creates real complexity—but also an extraordinary opportunity for manufacturing leaders who act strategically. The path forward involves controlling cloud costs with FinOps discipline, accelerating AI readiness by modernizing and rationalizing workloads, strengthening security across converged IT/OT environments, and augmenting lean teams with expert partners.

Connection’s approach starts with understanding your business goals and cloud readiness, not a one-size-fits-all lift-and-shift. As one of the few firms globally to have achieved the Microsoft Solutions Partner designation in all six solution areas, Connection brings end-to-end capabilities—from assessment and migration to optimization and 24x7 managed services—that help manufacturing organizations turn infrastructure into genuine competitive advantage.

Ready to transform your manufacturing IT strategy?

Talk to a Connection Azure Expert today.
1.800.998.0067

Connection is a certified Microsoft Azure Expert MSP and Microsoft Solutions Partner in all six solution areas. Learn more at www.connection.com/azure

Getting More from Microsoft 365 Without... https://community.connection.com/getting-more-from-microsoft-365-without-adding-complexity/ Mar 09, 2026 Christy Burton https://community.connection.com/author/christy-burton/ https://community.connection.com/wp-content/uploads/2026/03/355471-Saas-Microsoft365-BLOG.png

Most organizations don’t struggle with buying Microsoft 365. They struggle to consistently realize its value.

In a common scenario, IT assigns licenses, enables the right apps, and encourages teams to use Teams and store files in SharePoint. But without shared standards, day-to-day work quickly fragments and the platform starts to feel more complex instead of simpler.

Over time, collaboration habits diverge across departments. Information gets fragmented, follow-up gets uneven, and teams settle into different ways of completing the same work, resulting in inconsistency that complicates oversight and support.

In practice, value gaps in cloud platforms usually come from unclear operating models, not missing features. If no one agrees on where work happens, how information is shared, and how tools fit together, adoption spreads without standardization.

This article outlines practical ways to simplify collaboration, strengthen security and compliance, control costs, and prepare your environment for Copilot, along with guidance for managing continuous Microsoft 365 updates over time.

Microsoft 365 as a Unified Platform

Microsoft 365 delivers the most value when teams run it as one operating model instead of a set of separate apps. Each type of work should have a clear default home base, so employees spend less time deciding where something belongs and IT reduces time supporting workarounds.

In a unified ecosystem, identity, files, meetings, and teamwork connect by default. That doesn’t mean every feature gets used. It means the organization defines a few standard choices and documents them so that collaboration stays predictable as usage grows.

Many organizations roll out Teams, SharePoint, and OneDrive successfully but never define how those tools work together. When standards stay undefined, teams fill the gap with personal preferences. Over time, that ambiguity creates friction and tool sprawl.

How to Unify the Microsoft 365 Ecosystem

Document where work happens for core collaboration motions:

  • Chat (Microsoft Teams): Use channels for group and project work and keep 1:1 chat for informal conversations.
  • Meetings (Microsoft Teams): Use one standard meeting experience for scheduling, joining, and—where enabled—recordings and transcripts.
  • Files (SharePoint and OneDrive): Store team and project content in SharePoint. Use OneDrive for individual working files, not long-term records.
  • Co-authoring (Office documents stored in SharePoint or OneDrive): Work from a single source of truth with version history.
  • Tasks (Planner and/or To Do): Pick a default approach for task tracking and meeting follow-ups.

A practical first step: Publish three to five collaboration standards to reduce tool sprawl—for example, when to use channels versus email, where project files live, and how teams are named. Clear defaults reduce guesswork, cut duplicate work, and lower support effort by giving everyone clear guardrails for how work gets done.

Microsoft’s Teams admin resources and Microsoft 365 documentation emphasize establishing clear defaults for chat, meetings, files, and tasks.

Enhanced Communication and Collaboration

Most collaboration friction comes from inconsistency. When teams don’t share expectations for meetings, files, and follow-up, information spreads across inboxes, chats, and documents with no clear owner.

Predictable collaboration reduces rework, shortens meetings, and makes onboarding easier.

Build Repeatable Collaboration Patterns

The fastest improvements usually come from standardizing common activities:

  • Meetings: Share an agenda in advance, capture notes and decisions in one place, assign action items with owners and due dates, and use a consistent approach to recordings and transcripts where enabled.
  • File collaboration: Use shared ownership rather than personal silos, rely on version history instead of email attachments, and set clear rules for external sharing.
  • Cross-team work: Enable guest access where appropriate, and define what guests can see and do.

Example Workflow: From Meeting to Follow-through

A practical standard is to define completion at the end of every meeting. A Teams meeting starts with a shared agenda, the group captures notes and decisions in the same workspace, and owners receive follow-up tasks with due dates. Supporting files stay in the linked SharePoint site, and discussion continues in the related Teams channel rather than shifting to email.

The value comes from continuity. Agendas, notes, tasks, and files stay connected before, during, and after the meeting, so context stays intact and ownership stays clear. Microsoft supports this workflow through Teams and Loop collaborative notes, where shared agendas and notes can persist across Microsoft 365 apps, including Outlook, Word, and Planner.

Enterprise-grade Security and Compliance

Security in Microsoft 365 doesn’t have to reduce usability. The biggest gains usually come from setting a clear baseline first and then adding advanced controls only where they are needed. Strong security starts with sensible defaults that protect access and data without creating unnecessary friction.

The Non-negotiables

A practical baseline focuses on three areas:

  1. Identity protection
    • Require multi-factor authentication for all users.
    • Use conditional access to apply tighter rules when sign-in risk is higher, such as unfamiliar locations, risky sign-ins, or unmanaged devices.
    • Apply least-privilege access, which means people get access only to what they need for their role, and admin permissions stay limited, time-bound where possible, and reviewed regularly.
  2. Device posture
    • Define what counts as a trusted, managed device and what does not.
    • Allow full access from compliant devices, and limit or block access from unknown or non-compliant devices, especially for downloading or syncing data.
  3. Data protection
    • Identify which information needs extra protection and apply sensitivity labels accordingly.
    • Align retention settings to business and regulatory requirements so critical content is kept, discoverable, and disposed of on a defined schedule.

For regulated environments, documentation matters as much as configuration. Controls should be easy to explain, repeatable to apply, and defensible during audits.
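
In that spirit, baseline controls can even be kept as policy-as-code. The sketch below builds a require-MFA policy payload in the shape used by the Microsoft Graph conditional access API (POST to /identity/conditionalAccess/policies); the display name, the report-only state, and the empty exclusion list are illustrative choices, and field names should be verified against current Graph documentation before use.

```python
import json

# Sketch of a baseline "require MFA for all users" policy, expressed as a
# Microsoft Graph conditionalAccessPolicy payload. The display name and the
# report-only state are our illustrative choices, not prescribed values.
require_mfa_policy = {
    "displayName": "Baseline: require MFA for all users",
    "state": "enabledForReportingButNotEnforced",  # start in report-only mode
    "conditions": {
        "users": {
            "includeUsers": ["All"],
            "excludeUsers": [],  # add break-glass admin accounts here
        },
        "applications": {"includeApplications": ["All"]},
    },
    "grantControls": {
        "operator": "OR",
        "builtInControls": ["mfa"],
    },
}

# A versioned JSON definition like this is easy to explain in an audit
# and repeatable to apply across tenants.
print(json.dumps(require_mfa_policy, indent=2))
```

Starting in report-only mode lets you observe the policy’s impact on real sign-ins before enforcing it.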

Helpful References

To go deeper on the baseline controls in this section, the Microsoft Zero Trust guidance, Microsoft Secure Score, and the NIST Cybersecurity Framework provide practical reference points for access, configuration posture, and risk management.

Cost Efficiency and Scalability

Microsoft 365 costs don’t usually jump all at once. They build up gradually over time. Licenses get added for new hires, plans get upgraded to solve one-off needs, and overlapping tools stay in place longer than intended. Without regular visibility into real usage, spend slowly separates from how teams actually work.

Common drivers include duplicate tools that overlap with Microsoft 365 capabilities, underused licenses that stay assigned, and plans that no longer match day-to-day requirements. A simple, repeatable process can keep spend under control:

  • Inventory licenses versus usage: Compare what is assigned to actual activity and critical workflows. Microsoft 365 admin activity reports and usage analytics can help surface underused licenses and low-adoption services.
  • Identify overlaps to consolidate: Flag third-party tools that duplicate Microsoft 365 features, and then decide what to retire, what to keep, and what needs a standardized alternative.
  • Set a right-sizing cadence: Review licensing quarterly or biannually, and tie the process to onboarding, role changes, and offboarding so adjustments happen routinely instead of reactively.
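
The first bullet above lends itself to a simple script. This sketch assumes a hypothetical per-user activity export (the column names, sample data, and 90-day threshold are illustrative, not the actual admin center report schema) and flags licenses with no recent activity:

```python
import csv
import io
from datetime import date

# Illustrative export of assigned licenses and last activity per user.
# Columns and data are hypothetical, not the real admin center schema.
export = """user,license,last_activity
ana@contoso.com,M365 E5,2026-03-01
ben@contoso.com,M365 E5,2025-10-15
cam@contoso.com,Copilot,2025-11-02
dee@contoso.com,M365 E3,2026-02-20
"""

INACTIVE_DAYS = 90        # review threshold; tune to your own cadence
today = date(2026, 3, 9)  # pinned so the example is reproducible

flagged = []
for row in csv.DictReader(io.StringIO(export)):
    idle = (today - date.fromisoformat(row["last_activity"])).days
    if idle > INACTIVE_DAYS:
        flagged.append((row["user"], row["license"], idle))

# Surface the longest-idle licenses first as reclaim candidates.
for user, lic, idle in sorted(flagged, key=lambda r: -r[2]):
    print(f"{user}: {lic} unused for {idle} days -> candidate to reclaim")
```

Tying a report like this to the quarterly or biannual cadence in the third bullet turns right-sizing into a routine rather than a one-off cleanup.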

As organizations grow, cost discipline matters more. Governance that worked at a smaller headcount often needs clearer ownership, documented standards, and a defined review cycle. Security policies typically require tighter segmentation and enforcement, and the support model often shifts from informal help to standardized requests, escalation paths, and reporting. Regular license and tool reviews keep licensing aligned to real usage, reduce tool overlap, and support growth without compounding cost.

Connection’s Microsoft Landscape Optimization (MLO) engagement helps organizations turn Microsoft 365 from a collection of licenses into a streamlined, value-driven platform. By assessing the full Microsoft environment—usage, licensing, security, and configuration—Connection identifies opportunities to simplify operations, eliminate waste, and align tools to real business needs. The result is improved efficiency through better adoption and standardization, reduced costs by right-sizing and optimizing licenses, and the ability to scale confidently as business needs evolve.

AI-powered Productivity with Copilot

Microsoft 365 Copilot can deliver meaningful productivity gains, but only when the environment is ready. Copilot works within existing permissions and data boundaries. If content is scattered, overshared, or poorly governed, Copilot results become inconsistent and harder to trust. Clean data locations, disciplined access control, and clear guardrails make Copilot both useful and safe.

A Simple Copilot Readiness Checklist

Before scaling Copilot, organizations tend to focus on these basics:

  • Confirm content lives in SharePoint, OneDrive, and Teams rather than unmanaged repositories.
  • Clean up permissions, broad access groups, and anonymous or outdated sharing links.
  • Define sensitivity labels and data boundaries for critical information.
  • Establish acceptable-use guidance and training so users know what Copilot can and cannot be used for.
  • Pilot Copilot with a defined cohort and clear success measures before expanding access.
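As an illustration of the permissions-cleanup step, the following Python sketch scans a hypothetical sharing-links export and flags anonymous and stale links. The export format, link-type labels, and one-year staleness threshold are assumptions for the example, not a specific Microsoft 365 report schema:

```python
from datetime import date, datetime

# Hypothetical sharing-links export; fields and link-type labels are
# illustrative, not an actual SharePoint report schema.
LINKS = [
    {"site": "Finance", "link_type": "Anyone", "created": "2024-06-01"},
    {"site": "HR", "link_type": "People in your organization", "created": "2026-02-10"},
    {"site": "Legacy-Projects", "link_type": "Specific people", "created": "2023-01-15"},
]

def audit_links(links, as_of, max_age_days=365):
    """Flag anonymous ('Anyone') links and links older than max_age_days."""
    findings = []
    for link in links:
        created = datetime.strptime(link["created"], "%Y-%m-%d").date()
        age_days = (as_of - created).days
        if link["link_type"] == "Anyone":
            findings.append((link["site"], "anonymous link"))
        elif age_days > max_age_days:
            findings.append((link["site"], f"stale link ({age_days} days old)"))
    return findings

for site, reason in audit_links(LINKS, as_of=date(2026, 3, 25)):
    print(f"{site}: {reason}")
```

Reviewing a report like this before a Copilot pilot makes oversharing visible early, when it is still cheap to fix.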

Connection’s Microsoft 365 Copilot Envisioning Workshop is built to help organizations take the first practical steps toward AI at work—starting with clarifying needs, identifying priority Copilot scenarios, and mapping an actionable path forward. The engagement is structured to assess your environment, explore the “art of the possible,” and then build a plan that aligns Copilot capabilities to real business outcomes—while also highlighting readiness considerations and optimization opportunities tied to data, access, and security.

Connection also offers a Microsoft 365 Copilot Technical Readiness Assessment, designed to ensure the groundwork is in place for a successful rollout—because Copilot value depends on preparation. Our team of cloud experts evaluates your readiness, documents findings, and provides recommendations you can act on, enabling more informed decisions and a clearer, safer route to deployment—ultimately supporting a smoother implementation experience.

Safe Rollout Principles

Start with a measured pilot and treat Copilot as a program instead of a setting. Assign access to a defined group, review usage and feedback, and adjust governance controls before expanding. Clear ownership, readiness checks, and ongoing enablement reduce risk while improving the quality of Copilot output.

Continuous Improvement and Updates

Microsoft 365 is a continuously evolving cloud platform, and new features, security changes, and lifecycle updates are part of normal operations. Organizations that plan for change experience fewer surprises and higher adoption.

How to Build a Simple Operating Rhythm

Microsoft 365 changes stay manageable when every update goes through the same three-step decision path: communicate it, pilot it, and measure it. This keeps effort proportional and reduces the risk of changes breaking everyday workflows.

  • Change communications: Summarize what is new, what is changing, why it matters, and what users need to do, if anything.
  • Testing and rollout: Use pilots or rollout rings for higher-impact changes, and then expand based on results.
  • Feedback and measurement: Track adoption, usage patterns, and recurring friction points, and then adjust standards and training accordingly.

Handled well, continuous updates become an advantage rather than a disruption.

A Simple Microsoft 365 Optimization Plan

Getting more value from Microsoft 365 rarely requires a wholesale redesign. For most organizations, progress comes from a focused approach that brings clarity without disrupting day-to-day work.

A practical model follows three phases:

  1. Assess: Review how collaboration actually happens today, where content lives, how security controls are applied, and how licenses are being used.
  2. Standardize: Establish clear collaboration standards, apply a baseline security and compliance posture, and document governance decisions so expectations stay consistent across teams.
  3. Enable: Support users with guidance and training, pilot Copilot with appropriate guardrails, and measure adoption so improvements hold over time.

When these elements work together, outcomes become tangible: fewer fragmented tools, clearer collaboration patterns, stronger security posture, better cost control, and a safer path to Copilot value. For organizations that want a practical starting point, a Microsoft 365 assessment can identify where to simplify collaboration, strengthen security, right-size licensing, and prepare for Copilot based on how the business operates. Connection can support that assessment and help translate findings into a prioritized plan that improves collaboration, security, licensing, and Copilot readiness without overengineering the platform.

Choosing An AI PC for Business: A Practical... https://community.connection.com/choosing-an-ai-pc-for-business-a-practical-guide-to-intel-core-ultra-processors/ Feb 27, 2026 Michael Robie https://community.connection.com/author/michael-robie/ https://community.connection.com/wp-content/uploads/2026/02/3538374-Intel-Multi-Phase-Campaign-BLOG.jpg

With Windows 10 support having ended in 2025, many organizations are already refreshing laptops and reassessing device standards across a mixed workforce. At the same time, AI features are increasingly running on the device itself, which means processor choice now has a direct impact on performance, battery life, and the AI experiences users can access.

For business buyers, that puts new weight on selecting the right Intel® Core™ Ultra processor. Choosing the wrong tier can lead to over-provisioned devices for everyday users, underpowered devices for demanding roles, and avoidable variation across the device fleet.

This article looks at the Intel® Core™ Ultra lineup through a business lens, with a focus on choosing the right processor for different workplace roles.

What Is An AI PC? (And What It’s Not)

Before comparing Intel® Core™ Ultra processors, it helps to clarify what “AI PC” actually means in a business context. An AI PC is a computer designed to run AI workloads locally using a dedicated Neural Processing Unit (NPU), alongside the CPU and GPU. The NPU is purpose-built to handle AI tasks efficiently on the device, which can improve responsiveness, battery life, and data privacy by reducing reliance on cloud processing.

TOPS (trillions of operations per second) measures the theoretical AI compute capacity a processor can deliver. How that compute is used depends on the workload. More demanding (or short-duration) AI workloads typically run on the GPU, while sustained, low-power AI tasks are well suited to the NPU. Higher available TOPS generally enables more capable and more consistent on-device AI experiences as operating systems and a growing number of applications increasingly take advantage of AI processing on the PC.

Why “AI-capable” Can Mean Different Things

Not every system described as “AI capable” works the same way. Some rely primarily on cloud-based AI services. Others use GPUs to accelerate specific workloads. A modern AI PC includes a dedicated NPU designed for sustained, efficient on-device AI processing.

AI PC vs. Copilot+ PC

These terms are often used interchangeably, but they describe different things.

An AI PC refers broadly to systems designed to run AI workloads on the device using a combination of the CPU, GPU, and a dedicated NPU, depending on the type of workload.

Copilot+ PC is Microsoft’s defined category for a subset of AI PCs that meet specific hardware requirements to enable certain on-device Windows AI features. These requirements include an NPU capable of 40+ TOPS, at least 16GB of memory, and 256GB or more of local storage.

Most organizations start by deciding which AI-enabled Windows experiences matter for different roles. In cases where those experiences depend on Copilot+ PC–class hardware, selecting devices that meet those requirements becomes important. In other cases, many AI PCs can still deliver meaningful on-device AI benefits without needing to meet that specific category.
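The three Copilot+ PC thresholds above are simple to encode. The Python sketch below screens candidate devices against that hardware floor; the field names are illustrative, and real procurement checks would also consider OS and OEM certification details:

```python
from dataclasses import dataclass

@dataclass
class DeviceSpec:
    npu_tops: float   # rated NPU throughput in TOPS
    ram_gb: int
    storage_gb: int

def meets_copilot_plus(spec: DeviceSpec) -> bool:
    """Check the Copilot+ PC floor described above:
    40+ TOPS NPU, at least 16GB memory, 256GB or more storage."""
    return spec.npu_tops >= 40 and spec.ram_gb >= 16 and spec.storage_gb >= 256

print(meets_copilot_plus(DeviceSpec(npu_tops=48, ram_gb=16, storage_gb=512)))  # True
print(meets_copilot_plus(DeviceSpec(npu_tops=11, ram_gb=16, storage_gb=256)))  # False
```

A check like this is useful when filtering a long OEM quote list down to the devices that can actually run the Windows AI features a role requires.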

AI Capability Goes beyond CPU and GPU

A faster CPU or GPU does not automatically translate to stronger on-device AI performance. For many AI features built into Windows, the NPU—and its TOPS rating—is the key specification that determines what experiences a device can support.

How to Read Intel® Core™ Ultra Names

Intel® Core™ Ultra naming can look complex at first glance, but there’s a simple way to make sense of it when evaluating business laptops.

  • Tier: Ultra 5, Ultra 7, Ultra 9
    The tier indicates the overall performance class. Higher tiers typically provide more compute headroom and are better suited for heavier workloads.
  • Series / Generation
    Newer processor series often introduce platform-level improvements, such as gains in efficiency, graphics, and AI acceleration, while connectivity advances depend on the overall platform and adapter choices.
  • Suffix Letters (the fastest way to judge fit)
    The suffix letter indicates the type of laptop the processor is designed for:
    • V: Thin-and-light designs optimized for mobility and battery life; built on Intel’s Lunar Lake architecture (officially the Intel® Core™ Processor Series 2)
    • U: Mainstream productivity laptops
    • H: Performance-oriented systems
    • HX: Workstation-class laptops

These suffixes matter because they signal the likely balance between battery life, thermals, and sustained performance—often more clearly than tier alone. If you remember one thing, the suffix letter usually tells you more about real-world fit than the processor tier.
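For fleet-planning spreadsheets, the tier-and-suffix convention above can be parsed mechanically. This Python sketch assumes model strings follow the "Ultra <tier> <number><suffix>" pattern described in this section; it is illustrative only and does not cover every Intel naming variation:

```python
import re

# Suffix-to-fit mapping as described in this article (illustrative labels).
SUFFIX_FIT = {
    "V": "thin-and-light, battery-optimized",
    "U": "mainstream productivity",
    "H": "performance-oriented",
    "HX": "workstation-class",
}

# Matches strings like "Intel Core Ultra 7 268V" or "Ultra 9 275HX".
# "HX" is tried before the single letters so "275HX" is not read as "275H".
PATTERN = re.compile(r"Ultra\s+(?P<tier>[579])\s+(?P<number>\d+)(?P<suffix>HX|[VUH])")

def classify(model: str):
    """Return tier, suffix, and likely fit for a model string, or None if unrecognized."""
    m = PATTERN.search(model)
    if not m:
        return None
    return {
        "tier": int(m.group("tier")),
        "suffix": m.group("suffix"),
        "fit": SUFFIX_FIT[m.group("suffix")],
    }

print(classify("Intel Core Ultra 7 268V"))
```

Tagging every quoted SKU with its suffix this way makes it obvious when a "performance" quote is actually a battery-optimized part, or vice versa.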

The Intel® Core™ Ultra Lineup: Which Fits What Type of Business User?

Choosing the right processor isn’t just about performance numbers. What matters most is how each processor family is designed to support different ways people actually work. The Intel® Core™ Ultra lineup reflects these distinctions, making it easier to align performance, mobility, and efficiency to specific business roles instead of guessing based on specs alone.

Intel® Core™ Ultra 200V

Best Fit:
Executives, sales teams, consultants, managers, and most knowledge workers who spend their days in meetings, collaboration tools, and productivity applications.

What You Get:

  • Best-in-class battery life in thin-and-light designs
  • Fast, responsive performance for everyday business workloads
  • Efficient on-device AI acceleration via the NPU

Considerations:

  • Not designed for sustained heavy compute or workstation-class workloads
  • For consistent manageability and security across OEMs, standardize on Intel® vPro®-based SKUs

Why It Stands Out:

For many organizations refreshing laptops at scale, Intel® Core™ Ultra 200V is the simplest and most practical default for mobile-first business users. It delivers AI readiness, portability, and efficiency while keeping fleet standards simple and consistent.

Intel® Core™ Ultra U

Best Fit:
Standard office productivity roles and broad-based fleet deployments.

What You Get:

  • Balanced performance and power efficiency
  • Broad availability across mainstream business laptops
  • Predictable fit for mixed-use workloads

Considerations:

  • Less headroom for sustained multitasking compared to H-class systems

Intel® Core™ Ultra H

Best Fit:
Analysts, developers, and users running heavier multitasking or content creation.

What You Get:

  • Higher sustained performance than U-class processors
  • Better fit for demanding applications
  • Improved graphics performance with integrated Intel® Arc™ graphics

Considerations:

  • Typically thicker device designs
  • Shorter battery life compared to thin-and-light efficiency-focused laptops

Intel® Core™ Ultra HX

Best Fit:
Engineering, CAD, advanced analytics, and data science workloads that require sustained CPU performance, often paired with discrete graphics for intensive 2D and 3D work.

What you get:

  • Workstation-class performance
  • Support for discrete GPU configurations where workloads demand them

Considerations:

  • Larger, heavier devices
  • Higher power and thermal requirements; best suited for users who consistently need sustained high-performance compute

Don’t Stop at the Processor:
5 Specs That Keep AI PC Purchases from Backfiring

Choosing the right processor is an important first step—but it’s not enough on its own. Many AI PC deployments fall short not because of the CPU choice, but because supporting specifications weren’t aligned to how devices are actually used at scale. These five considerations have the biggest impact on performance, manageability, and long-term value.

1. Memory (RAM)
Memory plays a major role in how well AI-assisted features and modern Windows workflows perform over time.

  • 16GB is the practical baseline for Windows 11, collaboration tools, and AI-assisted productivity features
  • 32GB or more is worth standardizing for power users, heavier multitasking, and organizations planning longer refresh cycles

2. Storage
Storage capacity affects both performance and usability, especially as applications and local data footprints grow.

  • 256GB fills up quickly once business applications, updates, user files, and local AI models accumulate
  • 512GB or more is a safer standard for business fleets and helps reduce support issues over the device lifecycle

3. Connectivity
Connectivity directly affects productivity, especially in hybrid and mobile work environments.

  • Standardize on Wi-Fi 6E or newer for faster, more reliable connections in dense office or campus environments, provided supporting access point infrastructure is in place or planned.
  • Thunderbolt™ delivers a consistent standard for docking and peripheral connectivity while reducing desk-side complexity

4. Security and Manageability
For business fleets, security and manageability are just as important as performance.

  • Hardware-backed security features help protect data and identities at the device level
  • Choose devices that align with your existing device management and deployment approach to simplify rollout, updates, and future refresh cycles—platforms such as Intel® vPro® are designed for enterprise-grade security and manageability at scale.

5. Form Factor and Battery Life
Even the best-specified device can disappoint if it doesn’t match how users work.

  • Match device size, weight, and battery life to travel frequency and meeting-heavy roles
  • Thin-and-light designs work well for mobile users, while performance systems may trade portability for power

By standardizing these specifications alongside processor choice, organizations can reduce support issues, improve user satisfaction, and avoid costly mid-cycle exceptions.

A Practical Framework for Choosing the Right AI PC

Once you understand how the Intel® Core™ Ultra lineup is segmented, choosing the right AI PC becomes less about comparing specs and more about matching devices to how people actually work. This decision framework helps teams align processor choice to real-world roles, reduce over-specifying, and keep fleet standards manageable.

Ask three questions:

  1. Is the user primarily mobile and collaboration-focused?
    Look at meeting-heavy schedules, daily use of productivity and collaboration tools, and the importance of battery life and portability.
  2. Does the user run sustained, compute-intensive workloads?
    This includes development, engineering, data analysis, or content creation that places continuous demands on the CPU or GPU.
  3. Does the role require specific on-device AI experiences?
    All AI PCs support baseline Windows AI features such as Windows Studio Effects. More advanced on-device experiences, such as live caption translation and Click to Do, require Copilot+ PC–class hardware, which includes higher NPU performance and can influence processor selection.

Map the answers:

  • Default: Intel® Core™ Ultra 200V
    The right default for many mobile-first business users prioritizing battery life, responsiveness, and modern Windows experiences.
  • Mainstream fleets: Intel® Core™ Ultra U
    Well-suited for standard productivity roles where consistency, availability, and broad OEM choice matter.
  • Higher-performance needs: Intel® Core™ Ultra H
    A better fit for users who multitask heavily or run more demanding applications.
  • Workstation-class demands: Intel® Core™ Ultra HX
    Designed for engineering, CAD, and other workloads that require sustained CPU performance and, where applicable, discrete GPU acceleration.

By starting with a small set of user profiles and mapping them to the appropriate Intel® Core™ Ultra families, organizations can simplify processor selection, reduce device sprawl, and make the rest of the laptop configuration—memory, storage, security, and form factor—much easier to standardize.
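The mapping above can be expressed as a simple decision function. This Python sketch encodes the framework's defaults; it is illustrative only, since real selection should also weigh OEM lineup, budget, and specific Copilot+ feature requirements:

```python
def recommend_family(mobile_first: bool, sustained_compute: bool,
                     needs_copilot_plus: bool, workstation_class: bool = False) -> str:
    """Map the three framework questions to an Intel Core Ultra family.
    Illustrative sketch of the article's mapping, not a procurement rule."""
    if workstation_class:
        return "Ultra HX"       # engineering, CAD, data science
    if sustained_compute:
        return "Ultra H"        # heavy multitasking, demanding applications
    if mobile_first or needs_copilot_plus:
        return "Ultra 200V"     # the default for mobile-first knowledge work
    return "Ultra U"            # mainstream fleet standard

print(recommend_family(mobile_first=True, sustained_compute=False,
                       needs_copilot_plus=True))  # Ultra 200V
```

Even a toy function like this is a useful forcing exercise: it makes the organization write down, explicitly, which answers drive which device standard.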

Standardize First, Then Buy with Confidence

An AI PC is defined by practical, on-device capability. Intel® Core™ Ultra naming becomes easier to navigate once you understand tiers, generations, and suffix letters. For many organizations, Intel® Core™ Ultra 200V is the right default for mobile-first knowledge work, with U, H, and HX filling in as requirements become more specialized.

Next Steps:

TechSperience Episode 144: Future... https://community.connection.com/techsperience-episode-144-future-threats-not-yet-headlines/ Feb 13, 2026 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2026/02/3497571-TechSperience-Ep144-Security-Future-Threats-BLOG-BLOG.jpg

In this episode, the Connection Security Center of Excellence team looks beyond today’s headlines to uncover the threats most people won’t hear about until it’s too late. The team dives into the emerging risks shaping the next era of cybersecurity—from the looming reality of Harvest Now, Decrypt Later (HNDL) attacks, to the potential for identity collapse in a passwordless world, to the rise of autonomous AI agents as adversaries.

Rather than reacting to the news cycle, we explore what security leaders should be thinking about before the story breaks. If you want to understand the risks that are quietly forming on the horizon—and how they could redefine trust, identity, and digital resilience—this conversation is your early warning system.

Speakers:

John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection
Kimberlee Coombes, Security Solution Architect, Connection

Show Notes:

00:00 Introduction to Cybersecurity Challenges

01:27 Understanding Harvest Now, Decrypt Later

07:46 The Role of Identity in Security

14:08 The Rise of AI-driven Attackers

20:44 Planning for the Future of Cybersecurity

23:26 Mindset Shifts for Security Practitioners

It’s Time to Break Up with Your VPN https://community.connection.com/its-time-to-break-up-with-your-vpn/ Feb 10, 2026 Katie Springs https://community.connection.com/author/katie-springs/ https://community.connection.com/wp-content/uploads/2026/02/3492249-SonicWall-BLOG.jpg

VPNs Were Great, for a Time. But It’s Time to Put Yourself Back Out There.

There comes a time when every business has to ask itself, “Do I see myself spending the rest of my life with this VPN?” Sure, you’ve had some fun times, but does your VPN get you? When was the last time your VPN brought you flowers or made a reservation for a romantic seaside restaurant—wait, what is this about? Let me get back on track. When it comes to secure remote access, a lot of businesses are still leaning on old-school VPNs and just hoping for the best. Maybe they were good enough at one point. But now, with teams working everywhere, cloud apps dominating daily workflows, and cyberattacks hitting SMBs harder than ever, these aging tools just aren’t built for the moment.

Outdated SSL VPNs don’t just slow things down—they open the door to serious risk. If you’re tired of troubleshooting clunky clients and patching devices just to stay afloat, you’re not alone. And more importantly, there’s a better way.

VPNs Are a Headache (and a Security Risk)

Attackers know that VPNs are often the weakest link in your security stack. They count on poor patching habits, missing posture checks, and the fact that most VPNs don’t offer any form of granular access control. It only takes one compromised credential for an attacker to move laterally across your entire network—and that’s a problem no SMB can afford.

Let’s be real, VPNs are:

  • Slow: The bigger your team gets, the worse the performance becomes.
  • Hard to manage: Deploying and maintaining clients? Troubleshooting connection issues? It’s a time suck, especially for small IT teams.
  • Easily exploited: Attackers often target VPN vulnerabilities within 48 hours of disclosure. Patching takes much longer.

That’s not just frustrating. It’s dangerous.

A Better Way to Work Remotely

The good news? You don’t need to rip out your infrastructure to get a better remote access experience. SonicWall Cloud Secure Edge (CSE) offers VPN-as-a-Service (VPNaaS)—a simpler, smarter alternative to traditional VPNs that aligns with Zero Trust principles right out of the box.

With CSE’s VPNaaS, you can:

  • Ditch the client clutter: No more manual connections or complex installs. Users get fast, seamless access to what they need.
  • Strengthen your security: CSE verifies user identity and device health continuously—not just once at login.
  • Gain granular control: Only the apps and data a user needs. Nothing more. If something goes wrong, damage is contained.
  • Scale effortlessly: Whether you’re adding remote employees or growing your cloud footprint, CSE grows with you.

Start Simple—and Build Toward Zero Trust

Maybe you’ve heard of Zero Trust Network Access (ZTNA) but felt like it was too complex or too far off. That’s what makes VPNaaS such a smart starting point—it gives you immediate wins in security, manageability and performance without needing a complete overhaul. And as you grow, CSE grows with you—expanding into a full Zero Trust architecture at your pace.

Let’s make remote access a strength—not a liability. For more information about VPN-as-a-Service with SonicWall CSE, contact your Connection Account Team.

Celebrating 20 Years of Partnership at HIMSS26 https://community.connection.com/celebrating-20-years-of-partnership-at-himss26/ Feb 10, 2026 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2026/01/3482121-HIMSS26-Blog-BLOG.jpg

In 2026, Connection celebrates our 20-year Diamond sponsorship with HIMSS. Twenty years. Two decades. Like any long-term relationship, we measure our success not solely by longevity, but by the quality of the partnership. HIMSS is far more than its annual conference. For Connection and our clients, HIMSS is the largest consortium of healthcare IT professionals from the payor, provider, and life science community, all of whom are committed to leveraging the power of technology to unleash the high-quality, compassionate care that we want for ourselves and the people we love. From public policy advocacy to professional certifications and development, peer cohorts, and local chapter resources, HIMSS furthers its mission to “reform the global health ecosystem through the power of information and technology.”

More than a Theme

This year’s HIMSS theme, Expert Insights, Exceptional Impact, captures the urgency facing healthcare today. Rising costs, stagnant reimbursements, trust, and AI frameworks (or the lack thereof) are placing unprecedented pressure on our healthcare system. This year’s event also features robust content aimed squarely at those attendees looking for guidance on how to best integrate policy and legislation into their healthcare systems. Collaboration between different agencies, federal funding, and updates from CMS dominate the schedule. To learn more, follow the Government Connections Plaza Track. You’ll leave HIMSS as a confident agent for change in your organization.

HIMSS26 Booth Highlights

HIMSS26 will highlight Connection’s extensive offering as a Microsoft implementation partner. As an Azure Expert Managed Service provider, our solution architects will demonstrate why Connection has achieved our full suite of Microsoft Security Specializations. We will be showcasing how Microsoft and Connection are empowering healthcare organizations to strengthen security, modernize infrastructure, and unlock the full potential of their data. As cyber threats continue to escalate, Microsoft Cloud for Healthcare provides a comprehensive, Zero-trust-aligned security framework that protects patient information across identities, devices, networks, and workloads. Azure Managed Services further enhance resilience by helping payors and providers optimize and govern their cloud environments, freeing teams to focus on innovation and patient care. Combined with the power of Azure AI, healthcare organizations can securely analyze clinical data, predict trends, and personalize care at scale—all while maintaining strict industry standards for privacy and compliance.

We’re also excited to feature ConnecTrack, Connection’s purpose-built platform designed to simplify and optimize Microsoft licensing and infrastructure management. In today’s distributed IT landscape, gaining visibility into assets, budgets, and lifecycle needs can be challenging and time-consuming. ConnecTrack streamlines this complexity by centralizing software purchasing, usage insights, and lifecycle processes across programs such as Enterprise Agreement (EA), MSLA, and Smart Licensing. During HIMSS26, we’ll demonstrate how ConnecTrack delivers actionable intelligence with just a click, empowering healthcare IT leaders to make informed, strategic decisions that reduce TCO, increase agility, and support long-term digital transformation.

In addition to our Microsoft partnership, we will also be featuring our CNXN Helix Center for Applied AI and Robotics. With Helix Advisory Services, healthcare IT leaders can move beyond experimentation into practical adoption—learning to engineer safe prompts for clinical scenarios, build Copilot agents connected to EMR data, and apply compliance checks. We will also be featuring our Supply Chain and Lifecycle Solutions and Services that provide advanced tools designed to optimize procurement, asset management, and resource efficiency across the care continuum, ensuring critical assets are available where and when they’re needed through our Healthcare-in-a-Box solution.

Lastly, we will be hosting an exclusive in-booth reception with drinks and light refreshments for our valued customers on Wednesday, March 11 from 4:30–5:30 p.m. PT. If you are interested in attending, please reach out to your Account Team for more information.

Maximize Your HIMSS26 Experience

Let Connection curate your meaningful conference schedule! Work with your Account Team to schedule purpose-driven meetings, solution demonstrations, and networking opportunities with HIMSS26 exhibiting partners. This is a great opportunity to meet with the partners who matter most to your organization. Expect important announcements from the HIMSS community as they offer new solutions and partnerships live from the show.

Let’s Reimagine Healthcare IT at HIMSS 2026!

As we gather at HIMSS 2026, let’s embrace the opportunity to shape the future of healthcare IT together. Discover how our healthcare solutions and services are designed to enable better interoperability, advanced analytics, and AI-driven insights that improve both clinical workflows and patient outcomes, so you can achieve strategic goals with confidence. Schedule a meeting with the Connection team in Booth #3519 during HIMSS 2026, and visit our event site to learn more.

The New Reality of Healthcare IT: Hybrid,... https://community.connection.com/the-new-reality-of-healthcare-it-hybrid-secure-and-built-to-last/ Jan 29, 2026 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2026/01/3472171-Healthcare-IT-Hybrid-BLOG.png

Healthcare IT leaders today are operating in an environment that’s nothing short of relentless. Budgets are lean, talent is stretched thin, and the stakes couldn’t be higher: safeguarding patient data while supporting critical clinical workflows.

The days when technology decisions could be tactical are long gone. Now, they must be strategic, measurable, and resilient.

Hybrid Cloud Has Become the Default Architecture

When we talk about modern infrastructure in healthcare, we need to acknowledge something almost every organization has lived through: hybrid cloud has become the prevailing architecture for many health systems, and that’s not likely to change anytime soon.

Some workloads live in public cloud. Others stay on premises. And the value—the real operational value—comes from how well those environments work together.

Hybrid cloud also reflects practical decision-making:

  • Interoperability and integration needs push certain services into the cloud
  • Legacy EHRs and specialized systems often require on-premises infrastructure
  • Performance needs influence where workloads are best hosted

The goal isn’t “cloud first” or “on-prem always.” It’s making intentional choices about what belongs where, based on clinical operations, security, and agility.

Cloud Adoption Often Starts with Familiarity

For many healthcare providers, their path to cloud-based services started with Microsoft. Between Windows workstations, Microsoft 365, and Teams, the Microsoft stack is already embedded in day-to-day operations. That existing investment can help accelerate Azure adoption:

  • Familiar tools reduce adoption friction
  • Integrated security and compliance capabilities support HIPAA and other requirements
  • Native extensions for identity, governance, and threat detection help IT teams standardize and scale

For organizations already committed to Microsoft, Azure is often a logical next step because it fits their operational reality—not because it’s trendy, but because it just makes sense.

Security Is a Continuous Practice, Not a Project

Healthcare data is sensitive and mission critical. Cloud platforms provide baseline compliance capabilities and security tooling, but healthcare organizations still need a deliberate security strategy that reflects real-world risk.

Modern security practices for healthcare IT should include:

  • A Zero Trust mindset, where identity and access are continuously validated
  • Cloud-native controls that enforce guardrails, monitor compliance, and surface abnormal behavior
  • Reporting that supports both technical teams and executive stakeholders—especially when risk decisions require leadership visibility

Security and resiliency have become inseparable. You can’t afford to treat them as separate initiatives anymore.

Resiliency Has Expanded the Conversation Beyond DR

We used to talk about “business continuity” and “disaster recovery,” but the threat landscape has changed dramatically. Ransomware, outages, social engineering, and operational disruption have pushed healthcare organizations to think bigger. Resiliency requires planning for disruption, adapting quickly, and building layered defenses.

A resiliency approach often includes:

  • Redundancy for critical systems and workflows
  • Disaster recovery planning that aligns to operational priorities, not just technical specs
  • Cybersecurity practices integrated into infrastructure and application delivery

Resiliency also requires business owners at the table, not just IT teams. This isn’t just an infrastructure conversation, but an operational imperative.

Application Sprawl Drives Risk and Hidden Costs

Many mid-sized health systems operate hundreds, or even thousands, of applications. Some are essential. Others are legacy systems that have stayed in place because no one has had the time, staff, or budget to rationalize them. Without a clear strategy, organizations face:

  • Increased security exposure
  • Higher licensing and infrastructure costs
  • Operational complexity that slows change and innovation

A disciplined application lifecycle approach helps address this. Discovery, prioritization, modernization, and retirement are governance practices that strengthen security while freeing resources for higher-value work. It’s not glamorous, but it matters.

Measuring ROI Means Connecting IT to Outcomes

Healthcare organizations can’t justify new investments based on promise alone. They need measurable results. ROI can include dollars saved, but many of the most meaningful wins show up in outcomes like:

  • Reduced clinician downtime
  • Improved patient safety and experience
  • Faster deployment of new services
  • Reduced operational risk
  • Lower total cost of ownership through optimized consumption

When IT can tie infrastructure decisions to clinical and operational outcomes, leadership can evaluate investments with more confidence and more clarity. And that’s the conversation healthcare CIOs should be having with the C-suite.

The Human Reality Behind the Tech Stack

Technology decisions in healthcare always come back to people. Clinicians need tools that support real-world workflows. IT teams need solutions they can manage without burnout. Patients need secure, seamless experiences across every touchpoint, whether in person or virtual.

The infrastructure choices healthcare organizations make today—and the way those environments are governed—have a direct impact on care delivery, staff experience, and long-term sustainability.

Need Help Moving Forward?

Hybrid cloud, security, resiliency, and cost governance do not have to be tackled in isolation. Connection’s Healthcare Practice works alongside healthcare organizations to assess current environments, identify risk and optimization opportunities, and align Microsoft and Azure strategies to clinical, operational, and financial priorities.

We have built our healthcare practice around the realities our clients face, including limited resources, increasing regulatory pressure, and the need to do more with less. Our teams bring deep Microsoft expertise, healthcare-specific experience, and a practical approach focused on outcomes, not just infrastructure. Learn more about Connection’s Healthcare Solutions and Services and connect with your Account Team to discuss how we can help support your organization in the year ahead and beyond.

TechSperience Episode 143: Secure, Scalable,... https://community.connection.com/techsperience-episode-143-secure-scalable-streamlined-strategic-microsoft-solutions-for-healthcare-it-leaders/ Jan 28, 2026 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2026/01/3472371-TechSperience-Ep143-Microsoft-for-HC-BLOG.jpg

Healthcare organizations are navigating modernization under intense regulatory, security, and resource constraints. This episode explores how the Microsoft technology stack shows up differently in healthcare.

The conversation breaks down hybrid cloud realities, Azure managed services, security and compliance, business resiliency, disaster recovery, and cost optimization—all grounded in real healthcare use cases. The episode also explores how organizations can measure ROI beyond cost savings, connecting Microsoft investments to patient care, clinician experience, and operational resilience.

Speakers: Jennifer Johnson, Director of Healthcare at Connection

David Carey and Kevin Paiva, Senior Field Solution Architects at Connection

Show Notes:

00:10 Welcome and session overview
01:40 Why healthcare cloud adoption is different
02:10 Defining hybrid cloud in healthcare
03:00 Why hybrid is now the default model
03:55 Latency myths and performance realities
04:45 Which workloads belong on-prem vs. in the cloud
05:45 SaaS, staffing pressure, and infrastructure complexity
06:30 Azure managed services and the Connection approach
07:45 Comanaged Azure vs. fully outsourced models
08:30 Why Azure over other hyperscalers
09:20 Azure security, HIPAA, and Zero Trust
10:30 Azure Health Data Services
11:45 Business continuity vs. business resiliency
14:10 What healthcare leaders worry about most today
15:00 Disaster recovery and Azure Expert MSP
16:30 Post-pandemic resource constraints
17:30 Application sprawl, security, and identity management
18:50 Cost containment and ROI in healthcare IT
21:15 The teams behind the Connection Microsoft practice

IT in 2026: Navigating Change, Uncertainty,... https://community.connection.com/it-in-2026-navigating-change-uncertainty-and-the-new-talent-reality/ Jan 27, 2026 Patrick Dja Konan https://community.connection.com/author/patrick-dja-konan/ https://community.connection.com/wp-content/uploads/2026/01/3469522-IT-Talent-2026-BLOG.jpg

As IT teams head into 2026, the technology landscape continues to evolve at a pace that outstrips traditional hiring models. Artificial intelligence is accelerating change across nearly every function, but AI isn’t the only force reshaping how IT organizations operate. The bigger challenge for many IT leaders is uncertainty—uncertainty around which skills will be needed next, how long those skills will remain relevant, and how to build teams that can adapt without overcommitting headcount.

For many SMB and mid-market organizations, this uncertainty has driven a deliberate shift toward leaner IT teams. Not because demand has slowed, but because flexibility has become essential. Hiring full-time for every emerging skill set feels increasingly risky when technology cycles are shorter and priorities can change quarter to quarter.

Rather than locking into permanent roles too early, IT leaders are taking a more adaptable approach to workforce planning. Contract resources are being used to move quickly on critical initiatives and projects without long-term commitment. Contract-to-hire offers a practical way to validate both skill and fit before making permanent decisions. And managed services are helping teams cover ongoing or specialized needs while internal staff stay focused on higher-value, strategic work.

What’s changing in 2026 isn’t the importance of people—it’s how talent is deployed. The focus has shifted away from rigid org charts and toward flexible models that can evolve alongside the technology itself. Skills matter more than titles, and outcomes matter more than headcount.

The most effective IT leaders are building blended strategies that combine internal expertise with external support. This approach creates room to adapt as new tools, platforms, and capabilities continue to emerge—without slowing execution or increasing risk. As a strategic IT partner, Connection supports organizations with flexible IT staffing options, managed services, and a broad portfolio of IT solutions designed to help teams bridge gaps, stay agile, and move forward with confidence as priorities evolve.

How St. John’s Health and Connection Are... https://community.connection.com/how-st-johns-health-and-connection-are-transforming-community-healthcare-through-technology/ Jan 19, 2026 Kelly Kempf https://community.connection.com/author/kelly-kempf/ https://community.connection.com/wp-content/uploads/2026/01/3345071-Q4-St-Johns-Health-Customer-Success-BLOG.jpg

In the heart of Jackson Hole, Wyoming, St. John’s Health stands as a beacon of care for both its local community and millions of visitors each year. But what truly sets this hospital apart is not just its scenic location or its range of services; it is the way its IT and clinical teams collaborate to deliver secure, modern, and compassionate healthcare. I was fortunate to sit down with two members of the St. John’s Health community, Tyler Wertenbruch and Emily Graham, at our recent Tech Summit to discuss the partnership between St. John’s Health and Connection—and how we work together to solve the organization’s IT challenges.

Serving a Unique Community

St. John’s Health is a small community hospital nestled in the Tetons, offering 49 acute care beds, 52 long-term care beds, inpatient rehab, a transitional care unit, and 13 outpatient clinics. Emily Graham, Informatics Supervisor at St. John’s Health, says the hospital serves about 20,000 residents in the town and surrounding areas, and sees about 3 million visitors a year. This unique blend of residents and visitors requires the hospital to be prepared for everything, including seasonal increases in patients and unexpected emergencies.

IT Priorities: Security, Modernization, and Patient Experience

Managing healthcare IT requires careful coordination. Under the leadership of Tyler Wertenbruch, Director of IT, the team oversees a wide range of responsibilities, including helpdesk support, infrastructure, and system security. Their approach is guided by several priorities, with security consistently at the forefront. Every application and endpoint device is evaluated with a security-first mindset to ensure the protection of systems and data.

But security isn’t the only challenge. St. John’s Health also contends with technical debt, as years of aging equipment and services require modernization. Clinical needs are evolving, and patient expectations are higher than ever. Tyler’s team is focused on modernizing interfaces and bringing in solutions that make systems not only more secure but also easier for providers to use in delivering patient care.

Business Continuity: Planning for the Unthinkable

Cybersecurity incidents are on the rise in healthcare, and St. John’s Health has felt the pressure. Emily shares, “We’ve really tried to shore up our business continuity plans and our patient care continuity plans in the event of a downtime.” The hospital has worked hard to prepare every department for downtime, mapping out systems and processes and running drills to ensure readiness.

Protecting Patient Data: Trust at the Core

At St. John’s Health, safeguarding patient data is more than just a technical concern; it is fundamentally about maintaining trust. Emily explains that the people they serve in their close-knit community include friends, neighbors, and family members. Failing to protect their data could lead to a significant loss of trust.

Clinical and IT Teams: Bridging the Gap

One of the hospital’s strengths is the close collaboration between IT and clinical teams. Emily’s informatics team serves as a bridge, converting clinical needs into technical solutions. Tyler adds that technology now plays a larger role, and the team is starting to identify the core priorities for clinicians, specifically the critical systems that must stay operational to ensure patient care.

As healthcare evolves, patients and providers from five different generations each engage with technology in unique ways. St. John’s Health is adopting ambient listening technology through its electronic medical records system, using AI to simplify documentation. This innovation is helping providers deliver patient-centric care and reclaim several hours each day, contributing significantly to patient satisfaction and clinician work-life balance.

Cost Containment and Optimization

Budget constraints are a reality in healthcare. Tyler describes how Connection has helped St. John’s Health evaluate tools, vendors, and spending, identifying opportunities to save money and reinvest in critical areas. “That constant iteration is super important,” Tyler says.

Continuous Security Improvement

St. John’s Health regularly engages Connection for security advisory services, including point-in-time and year-long evaluations. “We’ve made significant improvement, but still identified significant gaps, which gave us clear direction on how to prioritize efforts and invest funds,” Tyler explains.

Together, Connection and St. John’s Health focus on assessments that result in projects such as vulnerability testing, resolving misconfigurations, and updating firewalls, among other initiatives. Connection’s specialists not only expand the hospital’s expertise but also guide and support the internal team, helping them build their own capabilities.

This relationship is rooted in continuous collaboration rather than quick fixes. The focus is on long-term support, proactively identifying future needs, and regularly engaging with the security team to address ongoing and emerging challenges.

The Power of Partnership

The relationship between St. John’s Health and Connection has lasted for almost twenty years, demonstrating a strong dedication from both organizations to mutual objectives and principles. Tyler points out the commitment shown by Connection’s team members over time, explaining that the continuity among individuals across various teams truly resonates with him, as their genuine care for St. John’s Health’s mission is evident. Emily adds to this sentiment, saying that the partnership is so solid and supportive that it feels as though Connection’s team is exclusively focused on their organization. This ongoing alliance allows St. John’s Health to pursue steady progress, teamwork, and forward-thinking solutions. By working in close cooperation with Connection, the hospital can set new standards for security, operational effectiveness, and the patient experience, making certain that technology consistently supports the requirements of caregivers, patients, and the larger community.

To read more about Connection’s partnership with St. John’s Health, check out our recent case study. For more information about partnering on your next healthcare IT project, or to learn more about our solutions and services, visit https://www.connection.com/healthcare. Interested in joining our next healthcare Tech Summit? Join us in July 2026 in Boston—contact your Account Team for more information.

State of Manufacturing: Navigating... https://community.connection.com/state-of-manufacturing-navigating-uncertainty-and-building-for-2026/ Jan 15, 2026 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2026/01/3454071-Manufacturing-IT-Trends-Blog-Post.jpg

American manufacturing stands at a crossroads. After weathering years of supply chain disruptions, inflationary pressures, and geopolitical uncertainty, the industry enters 2026 with cautious optimism tempered by hard-earned realism. The U.S. manufacturing sector experienced its eighth consecutive month of contraction in late 2025, with the ISM Manufacturing PMI holding below 50%. Global industrial output growth is projected to slow to 1.9% in 2026, down from 2.7% in 2025, according to Oxford Economics. Yet within this challenging environment, forward-thinking manufacturers are seizing the moment to fundamentally rethink their technical infrastructure, operational models, and competitive positioning.

What distinguishes this moment from previous downturns is the convergence of multiple transformative forces: artificial intelligence capabilities that have matured from experimental to operational, cybersecurity threats that have escalated from nuisance to existential, and a workforce crisis that demands technological solutions rather than incremental hiring strategies. Manufacturers are no longer simply seeking partners who can implement technology—they're looking for innovators who can help them navigate transformative change with speed, agility, and measurable impact. The complacency of traditional vendor relationships is giving way to demanding partnerships focused on total cost of ownership and rapid time-to-value.

Rethinking the Basics: Infrastructure for the Intelligent Factory

The Necessity of IT/OT Convergence

The most critical infrastructure challenge facing manufacturers today is integrating operational technology (OT) with information technology (IT). According to Dragos' 2025 OT Cybersecurity Report, 70% of OT systems are projected to connect to IT networks within the next year, yet 75% of OT attacks begin as IT breaches. This convergence creates both unprecedented opportunity and significant vulnerability. Legacy OT systems—many designed before cybersecurity was a consideration—are being connected to networks and cloud platforms, exposing production environments to threats they were never engineered to withstand. Rockwell Automation's State of Smart Manufacturing report found that cybersecurity risks have become the third-largest impediment to growth, with more than one-third of manufacturers planning to strengthen IT/OT architecture security over the next five years.

The path forward requires rethinking the "control plane" entirely. Manufacturers need hybrid infrastructure capable of supporting both AI workloads and traditional manufacturing execution systems (MES), enabling innovation in R&D, production, and operations while maintaining the reliability that production environments demand. Cloud-native MES platforms, edge computing architectures, and unified data fabrics are replacing siloed legacy systems. Microsoft's Manufacturing Data Solutions, for example, offers ISA-95 compliant data models that unify factory-domain data from sensors, MES, ERP, and automation applications. The goal is simplified operations, improved observability, and dramatically shortened deployment timelines for new capabilities.

Cybersecurity: From Checkbox to Strategic Priority

Manufacturing has been the most targeted sector for cyberattacks for four consecutive years. The 2025 Verizon Data Breach Investigations Report reveals that ransomware accounts for 47% of all manufacturing breaches, with the industry experiencing 89% growth in verified breaches in 2024 and 125% increases in financial impact. Between 2024 and Q1 2025, Bitsight TRACE documented a 71% surge in threat actor activity targeting manufacturing, with 29 distinct groups actively exploiting the sector. The ENISA Threat Landscape 2025 report shows OT attacks now represent 18.2% of all cyberthreats, with 59.3% of manufacturing attacks attributed to cybercriminal organizations.

The financial stakes are staggering. Ransomware attacks on manufacturing have caused an estimated $17 billion in downtime costs over the past seven years. The industrial sector experienced the largest increase in average breach costs in 2024—rising by $830,000 per incident. Yet only 19% of organizations report feeling completely prepared to handle OT security issues. The solution requires treating cybersecurity not as a technical function but as a business continuity imperative: network segmentation between IT and OT environments, hybrid security operations centers (SOCs) that monitor both systems, OT-specific threat intelligence, and comprehensive asset inventories that include industrial IoT devices. Organizations must embed cybersecurity into digital transformation from the outset, not bolt it on afterward.

Building the Data Foundation

The transition to intelligent manufacturing requires a fundamentally different approach to data architecture. Manufacturers generate data from production equipment, shop floor sensors, ERP systems, MES platforms, CRM applications, and IoT devices—but this data typically remains siloed and underutilized. According to MarketsandMarkets, the data integration market will expand from $17.58 billion in 2025 to $33.24 billion by 2030, driven largely by manufacturing's need to unify edge, IoT, ERP, and MES systems in real time. The concept of a "data fabric"—a unified architecture that integrates structured and unstructured data across diverse systems—is moving from aspiration to implementation.

Data maturity directly determines AI readiness. Manufacturers with strong data foundations can expand AI use cases into predictive maintenance, quality optimization, and demand forecasting. Those without clean, integrated data find themselves unable to move beyond proof-of-concept pilots. The investment priority for 2026 should be establishing data pipelines that connect production systems with enterprise applications, enabling the real-time analytics and contextual intelligence that AI applications require. Edge computing capabilities are becoming essential as IoT sensors proliferate, processing data locally before transmission to reduce latency and enable immediate decision-making on the production floor.
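As a toy illustration of the edge pattern described above—processing data locally before transmission—the sketch below collapses raw sensor readings into compact per-window summaries before anything is sent upstream. The window size, fields, and synthetic readings are all assumptions:

```python
# Illustrative edge-side pre-aggregation: reduce raw sensor readings to
# summary records so only a fraction of the data leaves the plant floor.
# Window size and summary fields are assumptions for the sketch.

from statistics import mean

def summarize_window(readings, window=10):
    """Collapse each window of raw readings into one summary record."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "avg": round(mean(chunk), 2),
            "count": len(chunk),
        })
    return summaries

# 100 synthetic temperature readings shrink to 10 summary records.
raw = [20 + (i % 7) * 0.5 for i in range(100)]
payload = summarize_window(raw)
print(len(raw), "->", len(payload), "records")
```

The same shape scales up: the heavier the raw stream, the more the local summarization (or local inference) pays for itself in bandwidth and latency.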

AI for Differentiation: Beyond the Hype Cycle

Vision AI: Quality at Machine Speed

Visual AI has emerged as one of the most immediately impactful applications in manufacturing. The global AI visual inspection market reached $4.13 billion in 2024 and is projected to add $12 billion in revenue by 2033. Unlike traditional rule-based machine vision, AI-powered systems learn patterns from image datasets and identify anomalies even when they haven't been previously encountered. Siemens has reported 30% improvements in inspection accuracy, while Foxconn achieved an 80% improvement in defect detection rates. In automotive manufacturing, AI inspection systems have reduced defect escape rates by up to 83%. A 2025 Consumer Technology Association report indicates AI inspection systems now achieve 99.97% accuracy in detecting solder joint defects on printed circuit boards.

The value extends beyond defect detection. Vision AI systems can identify wear, cracks, and structural anomalies in real time for predictive maintenance, reducing unplanned downtime by up to 50%. Edge AI brings computation directly to the production line, enabling immediate decision-making without network latency. These systems detect defects in under 200 milliseconds, enabling real-time corrections that minimize error propagation. For manufacturers in regulated industries like medical devices and pharmaceuticals, AI inspection provides the traceability and audit documentation required for compliance—with FDA data showing facilities using AI inspection technology experienced 64% fewer quality-related recalls.
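To make the contrast with rule-based machine vision concrete, here is a deliberately tiny sketch of the underlying idea: learn a statistical profile of "good" parts, then flag anything that deviates, rather than hand-coding rules for each defect. Production systems use deep neural networks on real images; the 4-pixel "images," the template averaging, and the threshold here are all illustrative assumptions:

```python
# Toy anomaly detection in the spirit of learned visual inspection.
# Real systems use deep models on full images; this sketch uses per-pixel
# means over tiny flattened grayscale "images" (all values assumed).

from statistics import mean

GOOD_SAMPLES = [
    [10, 12, 11, 10],
    [11, 11, 12, 10],
    [10, 13, 11, 11],
]
# Learn a "normal" template: the per-pixel mean of known-good parts.
template = [mean(px) for px in zip(*GOOD_SAMPLES)]

def anomaly_score(image):
    """Mean absolute deviation from the learned template."""
    return mean(abs(p - t) for p, t in zip(image, template))

def is_defective(image, threshold=3.0):
    return anomaly_score(image) > threshold

print(is_defective([10, 12, 11, 10]))  # close to the template: not flagged
print(is_defective([10, 12, 40, 10]))  # bright blob: flagged as anomalous
```

The key property is the one the article highlights: nothing in the detector enumerates specific defects, so a deviation the engineers never anticipated is still flagged.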

Conversational AI and Complex Data Interaction

Large language models are transforming how manufacturing teams interact with operational data. Microsoft's Factory Operations Agent, built on Azure AI Foundry, enables conversational interaction with manufacturing data at scale. Rather than requiring specialized queries or dashboards, production managers can ask questions in natural language and receive contextual answers synthesized from MES, ERP, sensor data, and quality systems. This capability is particularly valuable given the ongoing skilled worker shortage—enabling less experienced operators to access institutional knowledge and make informed decisions more quickly.

The integration of AI with existing enterprise systems—ERP, MES, PLM—enables predictive scheduling, automated quality checks, and dynamic resource allocation. Deloitte's 2026 Manufacturing Outlook notes that continuous integration of AI can lead to up to 40% downtime reduction and higher throughput. However, successful deployment requires treating AI as a product, not a project: establishing governance frameworks, implementing human-in-the-loop controls, and building the change management capabilities to transform workflows around AI capabilities rather than simply overlaying AI on existing processes.

Agentic AI: From Insights to Autonomous Action

The most significant AI development for manufacturing in 2025-2026 is the emergence of agentic AI—systems that don't simply analyze and recommend but take autonomous action. Deloitte projects agentic AI adoption in manufacturing will quadruple by 2027. Unlike traditional automation that follows rigid, pre-programmed rules, agentic systems reason through complex challenges, perceive context, and act independently across digital systems. According to McKinsey, companies extensively using AI in supply chains can improve logistics costs by 15%, inventory levels by 35%, and service levels by 65%. BCG reports that ServiceNow's AI agents are automating IT, HR, and operational processes, reducing manual workloads by up to 60%.

Practical applications are already delivering results. Agentic systems can monitor external sources to identify supply chain disruption risks, analyze cost-versus-delay tradeoffs for alternate suppliers, initiate contract negotiations, adjust production schedules, and update customers—all without human intervention. In quality management, AI agents access design files and bills of materials to generate work instructions, automatically revising them when engineering changes occur. For predictive maintenance, these systems analyze sensor data, predict equipment failures, schedule maintenance windows, order replacement parts, and adjust production schedules autonomously. The World Economic Forum notes that BMW is piloting humanoid robots for assembly tasks, while manufacturers report up to 25% reductions in energy costs through AI-driven resource optimization.
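The predictive-maintenance pattern just described can be sketched as a perceive-decide-act loop. Everything below is a hypothetical stand-in—the vibration threshold, the scheduling and parts-ordering hooks—not any vendor's API, but it shows the structural difference from rule-based automation: the agent both decides and acts across systems:

```python
# Hedged sketch of one agentic maintenance cycle: observe sensor data,
# predict a failure, then act (schedule downtime, order the part) without
# human intervention. Threshold and hooks are hypothetical stand-ins.

ACTIONS = []  # audit trail of autonomous actions taken by the agent

def predict_failure(vibration_mm_s):
    """Toy predictor: rising vibration signals bearing wear."""
    return vibration_mm_s > 7.0

def schedule_maintenance(machine_id):
    ACTIONS.append(("schedule", machine_id))

def order_part(part_no):
    ACTIONS.append(("order", part_no))

def agent_step(machine_id, vibration_mm_s, part_no):
    """One perceive-decide-act cycle for a single machine."""
    if predict_failure(vibration_mm_s):
        schedule_maintenance(machine_id)  # book a downtime window
        order_part(part_no)               # get the replacement in transit
        return "intervened"
    return "healthy"

print(agent_step("press-04", 8.2, "BRG-1187"))  # -> intervened
print(agent_step("press-05", 3.1, "BRG-1187"))  # -> healthy
```

In a real deployment the predictor would be a trained model and the hooks would call CMMS and procurement systems, with human-in-the-loop controls on the higher-stakes actions.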

The Workforce Reality: Technology as Force Multiplier

The manufacturing workforce crisis has moved from urgent to structural. As of March 2025, approximately 449,000 U.S. manufacturing jobs remain unfilled, according to St. Louis Federal Reserve data. The Manufacturing Institute projects that 3.8 million manufacturing positions will open by 2033, but 1.9 million—nearly half—could go unfilled. The median age of manufacturing workers is 44.3 years, with 26% of the workforce age 55 or older approaching retirement. This isn't simply a hiring challenge; it's an economic and national security issue that technology must help address.

Smart manufacturing investments are increasingly justified not by headcount reduction but by workforce augmentation—enabling existing workers to accomplish more. Deloitte's 2025 survey found that 80% of manufacturers plan to invest 20% or more of their improvement budgets in smart manufacturing initiatives. The Manufacturing Leadership Council reports that 22% of manufacturers plan to deploy physical AI (autonomous robots, humanoid systems) within two years—more than double current deployment rates. Collaborative robots (cobots), AI copilots, and AR/VR training systems are being deployed to boost productivity and fill capability gaps. The goal is not replacing workers but extending their capabilities and making manufacturing careers more attractive to the next generation.

Partnering for Success: Why Going Alone No Longer Works

The complexity of modern manufacturing technology—spanning cybersecurity, data architecture, AI implementation, and IT/OT convergence—exceeds the internal capabilities of most organizations. Manufacturers are recognizing that successful transformation requires partners who bring deep domain expertise, proven implementation methodologies, and the resources to accelerate time-to-value. This means support for early-stage strategy development, skillset augmentation during implementation, and ongoing expertise to mature capabilities over time. The manufacturers achieving results are those who select partners based on demonstrated manufacturing experience rather than generic technology credentials.

Jumpstarting AI infrastructure and establishing data fabric foundations can deliver quick wins that build organizational confidence and fund broader initiatives. One company pivoting from an enterprise-wide AI vision to a focused vendor onboarding automation project cut onboarding time by 40% within three months, according to BCG—generating the proof points needed to fund larger-scale deployments. The lesson: start with clearly defined use cases that deliver measurable results, then expand. Partners should offer not just technology implementation but strategic advisory services that help manufacturers identify high-value opportunities and sequence investments for maximum impact.

Looking Ahead: Positioning for Competitive Advantage

The manufacturers who will thrive in 2026 and beyond are those making strategic investments now—not waiting for uncertainty to resolve. Economic headwinds may persist, but the organizations building strong data foundations, securing their IT/OT environments, and deploying AI for operational advantage will emerge from this period positioned to capture market share when conditions improve.

The Connection Manufacturing Practice brings together deep manufacturing domain expertise with comprehensive technology capabilities spanning AI, cybersecurity, data infrastructure, and intelligent automation. Our team comes from manufacturing backgrounds—we understand production environments, regulatory requirements, and the operational realities that determine technology success or failure. The CNXN Helix™ Center for Applied AI and Robotics helps customers move from the far edges to transformative AI applications that differentiate them from competitors. Whether you're looking to establish your AI strategy, strengthen IT/OT security, build a modern data foundation, or deploy vision AI and agentic automation, we offer the partnership model—from early strategy through ongoing support—that delivers results in complex manufacturing environments.

Sources and References:

Deloitte 2026 Manufacturing Industry Outlook
https://www.deloitte.com/us/en/insights/industry/manufacturing-industrial-products/manufacturing-industry-outlook.html

Manufacturing Institute State of the Workforce 2025
https://nam.org/the-state-of-the-manufacturing-workforce-in-2025-33321/

Rockwell Automation State of Smart Manufacturing Report
https://www.rockwellautomation.com/en-us/company/news/blogs/cybersecurity-trends-2025.html

Bitsight 2025 State of the Underground Report
https://www.bitsight.com/blog/inside-cyber-threats-in-manufacturing-2025

Dragos 2025 OT Cybersecurity Report / Zero Networks Analysis
https://zeronetworks.com/blog/ot-security-trends-2025-escalating-threats-evolving-tactics

McKinsey Agentic AI in Advanced Industries
https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/empowering-advanced-industries-with-agentic-ai

BCG: How Agentic AI is Transforming Enterprise Platforms
https://www.bcg.com/publications/2025/how-agentic-ai-is-transforming-enterprise-platforms

World Economic Forum: Why Manufacturers Should Embrace AI Agents
https://www.weforum.org/stories/2025/01/why-manufacturers-should-embrace-next-frontier-ai-agents/

U.S. Chamber of Commerce: Labor Shortage Data
https://www.uschamber.com/workforce/understanding-americas-labor-shortage-the-most-impacted-industries

NIST Manufacturing Extension Partnership: 2025 Predictions
https://www.nist.gov/blogs/manufacturing-innovation-blog/whats-coming-us-manufacturing-2025

Wipfli 2026 Manufacturing Industry Outlook
https://www.automationalley.com/2025/12/01/wipfli-2026-manufacturing-industry-outlook/

Voxel51: Visual AI in Manufacturing 2025 Landscape
https://voxel51.com/blog/visual-ai-in-manufacturing-2025-landscape

ENISA Threat Landscape 2025
https://blog.denexus.io/resources/enisa-threat-landscape-2025-ot-attacks-industrial-cybersecurity-crisis

Design News: AI in Manufacturing Set to Quadruple by 2027
https://www.designnews.com/automation/ai-based-agentic-systems-in-manufacturing-set-to-quadruple-by-2027

Microsoft Cloud for Manufacturing Data Solutions
https://learn.microsoft.com/en-us/industry/manufacturing/manufacturing-data-solutions/overview-manufacturing-data-solutions

From Chatbots to Agentic AI: The “Hockey... https://community.connection.com/from-chatbots-to-agentic-ai-the-hockey-stick-evolution/ Jan 14, 2026 Kelly Kempf https://community.connection.com/author/kelly-kempf/ https://community.connection.com/wp-content/uploads/2026/01/3451521-Agentic-AI-Experience-Blog-Post-1.jpg

A few months ago, I had the worst roadside assistance experience of my life. That’s a bold statement—but let me set up the scenario. I travel quite a bit for work, and over time I have created routines to prepare for and execute my trips with minimal obstacles. Despite my meticulous planning, I occasionally forget something essential. In this case, it was the ONLY key to my vehicle, which I had left in ANOTHER city! This brings us to the aggravating and time-consuming experience mentioned above.

A Roadside Assistance Headache Begins

Picture this situation. I arrive home late on a Friday evening and take the airport shuttle to the parking garage. When I walk up to my SUV, my proximity key doesn’t unlock the doors. Trying not to panic, I dig through my bags, but the key is nowhere to be found. I call my fiancé and explain the situation, reminding him that I have 24-hour roadside assistance, and let him know I will call him back with an update once I have more information.

Next, I dial the roadside assistance number posted on my car window. After navigating a few prompts, I reach a live representative and recount my predicament. The representative informs me that—because of my specific model—my vehicle will need to be towed to the nearest dealership to have a new key programmed. After confirming the details, they explain that a tow company will call shortly to verify my location and provide an estimated time of arrival. Though inconvenient, the towing service is covered, and I begin receiving automated text updates regarding the progress of my case.

When Automated Assistance Goes Wrong

Unfortunately, I was wrong to assume this would be simple to resolve. What followed was an exhausting ordeal: it took a total of 37 phone calls—yes, you read that right—just to arrange for my car to be towed and to have a new key made. Throughout this process, I had to interact with four tow truck companies and three separate roadside assistance representatives. The situation required the dispatch of two types of tow trucks, and it led to two separate trips from the airport to my home and back again, as well as two visits to my car dealer. It was only through the support of one exceptional parts manager and a roadside assistance manager—whom I kept on speed dial—and a tremendous amount of patience over a span of 36 hours that I was finally able to get my car successfully towed and a new key made. While some complications could be attributed to the specific type of vehicle I own, most of the issues stemmed from communication system limitations. Although some of the tasks were automated, the lack of adaptability and interoperability resulted in frustrating loops and significant delays in achieving the final goal. Instead of streamlining the experience, the technology implemented created headaches for everyone involved. So, what’s next? Let's discuss ways organizations can overcome automation challenges and greatly enhance end-user satisfaction by leveraging the newest technological innovations in automated processes.

The Evolution of Intelligent Automation

The foundation for chatbot logic and the future concept of AI agents was established by Alan Turing in the 1950s, when he first proposed the idea of machine intelligence. The earliest chatbots emerged in the 1960s, building upon his theories. By the 1990s and early 2000s, the evolution continued with the rise of Robotic Process Automation (RPA). RPA brought rule‑based task automation into the workplace, such as in call centers and manufacturing, where it replaced basic repetitive human tasks with technology-driven solutions.

As the technology matured, the following decade saw the emergence of cognitive automation—RPA enhanced by machine learning and AI-based decision making. This shift enabled automated systems to move beyond simple call trees and allowed them to interpret emails, documents, and images, and even make basic probabilistic decisions. The result was widespread adoption of these solutions across industries such as healthcare, banking, insurance, and HR operations.

In the 2010s, significant advances in Natural Language Processing (NLP) led to the birth of Conversational AI, and conversational assistants such as Siri emerged. Advances in NLP and deep learning enabled chatbots to understand intent, maintain context, and engage in more natural dialogue.

Within the last five years, with the arrival of Large Language Models (LLMs) such as those behind ChatGPT, systems gained the ability to generate human‑like text and code, summarize and synthesize information, and handle unstructured inputs at scale. Though these digital assistants became conversational and context-aware, they remained primarily reactive: they could respond to queries but lacked the ability to autonomously act across multi-step workflows. This limitation was evident in my roadside assistance scenario, when it took 37 phone calls for an “automated system” to properly communicate the information needed to complete a simple task; it lacked adaptability for the unknowns.

The Rise of Agentic AI: Autonomous Execution and Adaptability

In the past year, further technological advancements have ushered in the era of Agentic AI. This new class of artificial intelligence marks a transformative leap in automation. Agentic AI systems are not limited to generating responses—they are capable of taking autonomous actions to execute complete workflows and achieve multi-step objectives.

The distinction between AI agents and traditional RPA is not solely technical but philosophical. RPA executes exactly what it's programmed to do—every action is predetermined and every decision hard-coded. In contrast, AI agents are provided with a goal and autonomously determine how best to accomplish it. With LLM reasoning, these agents can process ambiguous inputs, exercise judgment, and successfully navigate novel or unfamiliar scenarios. A current example of Agentic AI in action is its application within the Claims Management Cycle involving payors and providers.

Essentially, agentic AI systems “learn and adapt” in ways that mirror human behavior. The integration of generative AI lifts automation beyond deterministic rules, enabling bots to interpret ambiguity, reason through uncertainty, and operate independently. This level of adaptability is something traditional RPA could never achieve. Collectively, the convergence of these technologies signals a significant evolution: moving from the automation of discrete tasks to the automation of entire decisions, and ultimately, to the automation of outcomes.
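To make the contrast concrete, here is a minimal sketch in Python. All names are hypothetical and no real agent framework is implied: the first function shows the RPA style, where every decision is hard-coded in advance; the second shows the agentic style, where a reasoning step (standing in for an LLM) chooses the next tool, observes the result, and adapts until the goal is met.

```python
# Hypothetical illustration: rule-based RPA vs. a goal-driven agent loop.

def rpa_dispatch_tow(vehicle_type: str) -> str:
    """RPA style: every branch is predetermined.
    An input the rules never anticipated simply fails."""
    rules = {
        "sedan": "standard tow truck",
        "suv": "flatbed tow truck",
    }
    if vehicle_type not in rules:
        raise ValueError("Unhandled scenario - escalate to a human")
    return rules[vehicle_type]


def agent_dispatch_tow(goal: str, tools: dict, reason) -> str:
    """Agent style: given a goal, a set of tools, and a reasoning
    function (standing in for an LLM), the agent picks a step,
    observes the result, and iterates until the goal is met."""
    observations = []
    for _ in range(10):  # bounded loop as a safety measure
        action = reason(goal, observations)  # reasoning picks the next step
        if action == "done":
            return observations[-1]
        observations.append(tools[action]())  # execute the chosen tool
    raise RuntimeError("Goal not reached - hand off to a human")
```

The key difference is visible in the signatures: the RPA function receives data and applies fixed rules, while the agent receives a goal plus capabilities and decides its own path, which is what lets it absorb the unknowns that derailed my roadside assistance case.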

Reflections on Technology Dependence and AI Implementation

This roadside assistance ordeal was a first-world problem, but it prompted a deeper reflection on our dependence on technology, its reliability, and the alignment of intent versus outcome. Working in the tech industry and assisting organizations with the implementation of technical solutions has given me a broader perspective on the far-reaching impacts these technologies can have. It is essential, as we plan and execute projects, to critically evaluate the limitations and scale of the tools we consider.

Gartner predicts 33% of enterprise software will incorporate agentic AI by 2028, and the market is projected to grow from $5.4 billion in 2024 to $47 billion by 2030. Agentic AI, though relatively new, will become part of the enterprise toolset fairly quickly. Effective planning for Agentic AI integration requires careful documentation of necessary tasks and scenarios that might arise during an interaction. It is crucial to involve the appropriate stakeholders, ensuring that we identify proper workflows and areas where flexibility is needed to accommodate exceptions or implement overrides. And recognizing when and how to introduce human intervention into automated and AI-driven processes is a vital part of this planning.

The Promises of Agentic AI: Designing Adaptive Systems for Optimal User Outcomes

Ultimately, our goal is to streamline workflows and improve performance or output for end users—whether they are knowledge workers, clinicians, or patients. Achieving this requires more than simply deploying advanced technologies; it involves a thoughtful approach to system design that prioritizes user experience and reliability.

To avoid negative experiences, it is essential to leverage adaptive technology, like Agentic AI, to design systems with flexibility in mind, ensuring they can adapt to a wide range of scenarios and user needs. However, technology alone cannot address every possible situation. That is why it is crucial to incorporate mechanisms for human involvement—allowing for human-in-the-loop interventions when automated processes encounter exceptions or unanticipated challenges.
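The human-in-the-loop pattern described above can be sketched in a few lines. This is an illustrative outline, not a production design, and every name in it is hypothetical: automated handling proceeds normally, but unanticipated failures or low-confidence results are routed to a human review queue instead of looping endlessly.

```python
# Hypothetical human-in-the-loop sketch: automation with a human escalation path.

def run_with_human_fallback(task, automated_handler, human_queue,
                            confidence_threshold=0.8):
    """Run the automated handler; escalate to a human on any
    exception or whenever confidence falls below the threshold."""
    try:
        result, confidence = automated_handler(task)
    except Exception:
        human_queue.append(task)  # unanticipated failure -> human review
        return "escalated"
    if confidence < confidence_threshold:
        human_queue.append(task)  # low confidence -> human review
        return "escalated"
    return result
```

The design choice worth noting is that escalation is a first-class outcome, not an error state: the system plans for the cases it cannot handle rather than retrying them forever, which is exactly what was missing from my 37-call ordeal.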

By combining the adaptability of trainable Agentic AI with the safeguard of human oversight, organizations can deliver both the agility and the safety net necessary for users. This dual approach ensures that individuals navigating increasingly automated environments are supported, empowered, and protected against the shortcomings of fully automated solutions. Learn more about how CNXN Helix Center for Applied AI and Robotics helps deliver consistent, high-quality outcomes while remaining resilient in the face of uncertainty and complexity.

]]>
Cost Containment in Healthcare IT: A... https://community.connection.com/cost-containment-in-healthcare-it-a-practical-look-at-private-cloud/ Jan 07, 2026 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2026/01/3403648Connected_Cloud-Cost-Containment-BLOG.jpg

When most of us think about healthcare, we think about ourselves or someone we love. Healthcare is deeply personal. Yet this year I watched wave after wave of healthcare AI launch with seemingly bottomless venture capital behind it, while the hospitals caring for those people we love are fighting to keep their doors open.

That disconnect is exactly why we need to talk about cost containment. This blog is the first in a year-long series on cost recovery in healthcare IT. My goal is to examine real-world ways hospitals can reduce spend across their existing technology stack without compromising the Quintuple Aim.

This first installment focuses on Dell Private Cloud Infrastructure and what it means for organizations trying to reduce infrastructure and licensing spend while maintaining patient care, innovation, and clinician support. I will share what I’m hearing from executives, the financial realities they face, and how an infrastructure approach like Dell’s can shift the conversation from “What do we cut?” to “Where can we recover value?”

Because ultimately, we need to get back to where healthcare really starts: our own health and the people we care about most.

Re-centering Healthcare: The Quintuple Aim

Today's Quintuple Aim evolved from the Institute for Healthcare Improvement's 2008 “Triple Aim” framework that many of us know well. It reflects five priorities for modern healthcare: improve population health, deliver a better experience for patients, lower the overall cost of care, support the well-being of clinicians, and advance health equity.

On paper, it is straightforward. In practice, achieving all five of these goals is far more challenging in today’s environment of rising costs, growing technical debt, and constant pressure to do more with less.

To put that challenge in perspective, U.S. healthcare spending was projected to reach $5.6 trillion in 2025, about 18% of GDP. At the same time, Americans are living shorter and sicker lives than people in other high-income countries. That is not the story of a system delivering better outcomes. It is the story of a system becoming more expensive while still struggling to give patients and clinicians what they need most.

I’m hearing this reality directly from healthcare leaders. In a focus group I led last month, I asked eight CIOs if their jobs were harder now than they were during the pandemic. Every single one said yes. These executives readily shared that the cost of healthcare IT is now encroaching on budgets that directly impact patient care. These budgets were meant to deliver better outcomes for more people without contributing to clinician burnout. Even when making the argument that great technology is great patient care, it's becoming harder to lower costs while maintaining that promise.

Where Healthcare IT Fits into the Cost Crisis

There’s widespread agreement that technology plays a vital role in our individual health outcomes. But there is a widening gap in our understanding of how the cost burden should be shared, and by whom.

The healthcare IT industry takes the Quintuple Aim principles and applies them to technology solutions that build the digital health framework. SaaS adoption across the full IT stack initially fulfilled the promise of the cloud; however, climbing costs with each subsequent subscription iteration have led to technical debt. Post-pandemic, healthcare organizations are feeling the financial pressure from every angle:

  • 37% of hospitals are losing money.
  • Inflation and a tariff-impacted supply chain continue to strain budgets.
  • Reimbursements are stagnating.
  • IT budgets are hovering around 3% or less of overall hospital spend.

And as I write this, we are more than a month into a government shutdown centered on the future of Medicaid, another reminder of just how unstable the broader environment around healthcare financing really is.

How Dell Supports the Quintuple Aim Through Cost Recovery

Dell Private Cloud Infrastructure uses familiar building blocks like PowerEdge compute, PowerStore storage, and the Dell Automation Platform to help hospitals regain control of their environment and support patient care more efficiently. Below are a few of the key ways that shows up in practice.

1. Lower Total Cost and Efficient Resource Use

For many organizations, there is no mystery about where the money is going: too many boxes, too many cores, and too many licenses. By consolidating infrastructure, Dell Private Cloud reduces both the physical footprint and the number of VMware or Broadcom licensed cores you have to support.

Better utilization and built-in data reduction mean you are not paying to power, cool, and maintain resources you do not really need, which drives down total cost of ownership in a way that is very tangible to a CFO.

2. Freedom from Hypervisor and Vendor Lock-in

Healthcare IT leaders need options, not one more form of lock-in. Dell Private Cloud supports VMware, Red Hat, Hyper-V, Nutanix, and containers so you can make decisions based on clinical and business needs, not just licensing changes.

Because compute and storage scale independently, you can grow where you need to grow without triggering yet another rip and replace cycle.

3. Modernized, Automated Operations

Most hospitals are running incredibly lean IT teams. The day-to-day work cannot rely on heroics anymore. With a single console through Dell Automation Platform, provisioning, lifecycle management, and updates become more predictable and far less manual.

That means fewer late night maintenance windows, fewer surprises during upgrades, and more time back for the projects that actually move the needle for clinicians and patients.

4. Ready for AI, Hybrid, and Multicloud Workloads

Whether your organization is piloting AI or machine learning at the edge, expanding containerized workloads, or navigating a hybrid or multicloud strategy, the underlying infrastructure has to keep up. Dell Private Cloud is built with those realities in mind, supporting performance hungry workloads while improving data reduction, observability, and operational efficiency.

In other words, you are not building yesterday’s data center and hoping it will handle tomorrow’s use cases.

5. Protecting Your Existing Investments

Very few healthcare organizations have the luxury of starting from scratch. Dell Private Cloud allows you to phase in existing PowerEdge servers and PowerStore storage instead of throwing away sunk costs.

For teams that prefer a consumption-based approach, Dell Apex and Dell Financial Services provide flexible models so you can modernize at a pace that aligns with your priorities.

Dell Private Cloud Benefits

  • PowerEdge Compute: Optimized for high-core density, reducing physical footprint and energy costs
  • PowerStore Storage: Up to 30% faster workloads, 54% lower energy costs, and 5:1 data reduction guarantee
  • Dell Automation Platform: Zero-touch onboarding, automated provisioning, and lifecycle management reduce operational overhead by 30–50%
  • Healthcare-specific compliance: HIPAA, HITRUST, SOC2 built-in

Moving Forward with the Right Partner

Behind solutions like these are the people who help bring them to life. Connection has 17 team members dedicated exclusively to Dell and more than 225 individuals certified across technical and pre-sales tracks. Our Titanium Partner status and broad authorizations across Converged Infrastructure, Data Protection, Networking, Server, and Storage support more than 800 healthcare clients nationwide.

Every healthcare organization’s environment and priorities are unique. If you would like to explore a roadmap tailored to your organization, reach out to your Connection Account Executive or engage our Healthcare Practice to schedule time with our team.

*This is non-sponsored content.

]]>
Winning Retailers Focus on Productivity and... https://community.connection.com/winning-retailers-focus-on-productivity-and-efficiency/ Dec 30, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/12/3365071-Retail-Efficiency-BLOG.jpg

Retailers are prioritizing employee productivity and operational efficiency to adapt to economic uncertainty and evolving consumer behaviors. This is a trend that will only expand as new technologies make it easier to improve productivity every single day. There is a constant drive to do more with less and be smarter at the same time. Economic pressures shape both hiring strategies and technology adoption.

As an example, you might have noticed that holiday staffing levels in 2025 were at their lowest in over a decade. Retailers added fewer seasonal workers in 2025 than at any time since at least 2009, with projections between 265,000 and 365,000 seasonal hires nationwide—a stark drop from previous years.1 The economic environment pushed retailers to extract maximum value from each employee and leverage flexible, scalable solutions.​

Economic Factors Shaping Staffing and Technology Choices

  • Consumer Spending Slowdown: Economic headwinds, including a GDP contraction and increased unemployment concerns, have caused both retailers and consumers to become more cautious, curbing discretionary spending and making efficient operations even more critical.​
  • Labor Market Tightness: Even though retail employment is near multi-year highs, seasonal hiring is lagging, partly because the number of job seekers is catching up with job openings—a trend that typically leads to slower wage growth and increased focus on efficiency.​
  • Cost Pressures: Tariffs, supply chain issues, and inflation continue to drive up operational costs, making labor optimization and technology-driven productivity essential to maintaining profitability.​

Technology Solutions for Productivity and Efficiency

To meet these challenges, retailers are leaning into several technological solutions:

  • Conversational AI and Chatbots: Virtual assistants provide employee support for HR and operational questions, facilitate real-time problem-solving, and streamline training. The ability to provide enterprise specific data accurately, in the right place, at the right time provides a competitive advantage.
  • Data Analytics for Performance Monitoring: Real-time analytics monitor employee productivity and produce actionable insights for continuous improvement, aiding in both cost reduction and better decision-making.​
  • Computer Vision Solutions: Computer Vision AI for retail offers one of the most robust fields of AI today with proven ROI by leveraging existing camera systems to impact the entire business.
  • AI-powered Scheduling and Forecasting: Artificial intelligence and analytics-driven software are used to predict demand, automate shift scheduling, and align staffing with real-time needs, helping minimize both over- and understaffing.​

Retailers Are Not Alone

Retailers big and small are aiming to maintain their competitive advantage by rapidly adopting AI, automation, and advanced analytics to drive employee productivity and operational efficiency. Employee and operational efficiencies can be found through a number of solutions. It is critical that each decision is made with a long-term goal in mind and trusted partners to lean on.

Sources:

1 https://www.cbsnews.com/news/employment-retail-seasonal-jobs-hiring-lowest-in-15-years/
]]>
2026 Healthcare IT Trends: Priorities When... https://community.connection.com/2026-healthcare-it-trends-priorities-when-everything-is-urgent/ Dec 19, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/12/3419898-2026-Healthcare-IT-Trends-BLOG.jpg

I just finished reading my 2025 IT Trends blog, published in December 2024, and apart from the general sense of cringe I get anytime I re-read something I wrote, I’m struck by how prescient some of my observations were: President Trump, in fact, signed more than 215 Executive Orders, three of those part of the administration’s larger AI Action Plan; Connection saw an increase in healthcare organizations adopting virtual desktops at point of care; and large enterprise healthcare organizations worked with our team to consolidate their application stack, renegotiate cloud agreements, and—gasp—move select workloads on-premises. What I could not have predicted was that healthcare would be at the center of a 43-day government shutdown or that, more than a year later, we’ve kicked the can on telehealth use and reimbursement waivers to the end of January 2026.

As we greet the new year, I expect healthcare IT professionals to play an integral role in the success of clinical and non-clinical hospital priorities. It’s not going to be easy, but the healthcare IT community never disappoints!

Financial Pressures

Patient care holds a sense of urgency whether the care setting is acute or ambulatory, and 2026 patient care will be shaped by a savvy community of healthcare CIOs with the agility to manage competing priorities, and they’ll be doing it under tremendous financial pressure. A 2014 NIH study examined the correlation between hospital finances and the quality and safety of patient care and found that, while operating margin by itself is a poor predictor of quality care, financially stable hospitals are better able to maintain the systems that lead to better patient outcomes.

Cost containment, cost avoidance, and good old-fashioned ROI will be central to budget allocation for projects both large AND small. I expect to see more hospital projects—things that would have easily been approved 12 months ago—to undergo a formal bid or RFP process. The C-suite wants assurance that every new solution implemented has clear and measurable return on investment and is deployed with cost consciousness.

AI: It’s Working!

According to a survey from the Commonwealth Fund, 43% of U.S. primary care providers describe feeling burned out, citing administrative burden as a chief contributor. Another, unrelated survey with a much smaller sample concluded that use of ambient AI significantly reduced healthcare provider burnout by lessening documentation burdens. Ambient AI was a natural starting point for clinicians, and integration back to the EMR is essential. That patient-facing caregivers, including nurses and allied health professionals, may still be experiencing burnout suggests that there are more areas where applied AI can increase provider productivity and return joy to the provider.

The arrest of Cashmir Chinedu Luke, owner of Four Corners Health, on suspicion of $7M in fraud perpetrated against the Department of Veterans Affairs is a reminder of how susceptible healthcare organizations are to fraud. AI excels at using pattern recognition to identify anomalies, and as this case makes its way through the courts, I expect healthcare payors and provider organizations will turn to these solutions to maintain organizational integrity. Even if financial recovery isn’t possible once fraud has occurred, creating fraud-resistant systems is a significant preventative measure.

Hospital Revenue Cycle teams traditionally rely on manual work, done repetitively, limited to business hours, in a seemingly endless loop of submitting and resubmitting claims to CMS and insurance companies for the appropriate reimbursements. As more hospitals turn to AI for tech-first revenue cycles, they’re seeing the benefit of 24x7x365 claims submission with fewer denials and shorter resolution times. Shortening this cycle leads to healthier cash flow for the provider. These solutions are scalable and right-sized across the different points of care, and I predict that we’ll see huge growth in AI project implementations that deliver bottom-line financial impact.

Rural Health

The One Big Beautiful Bill provides a $50B Rural Health Transformation Program aimed at modernizing care delivery in rural areas beleaguered by low patient census, a hard-to-recruit workforce, and legacy infrastructure that can’t support patient care. This funding is in response to the $1T cut to Medicaid over the next 10 years. While the funding is strong, this isn’t easy money, and it probably doesn’t fill the void left by Medicaid cuts for the low-income populations served by rural health. Plus, rural healthcare needs are not well understood. The IT OEM-partner community has struggled to meet the essential needs of rural areas lacking both healthcare and IT professionals. More than $11B in capital funded AI healthcare IT startups in 2025, and although those organizations will struggle to support this important care setting, the desire to fulfill the promise of equitable access to high-quality patient care will prevail.

Healthcare IT Solutions at Connection

January marks my 16th year with Connection and nearly my third year as Director of Healthcare. We’ve expanded our team, adding additional resources to support our GPO contracts. We’ve added more pre-sales healthcare technical expertise and aligned our solutions to cost containment, provider experience, patient experience, and quality outcomes. We built this for you, our clients and our partners, to help you face the challenges ahead. Learn more about our Healthcare Solutions and Services and contact your Account Team to discuss how we can best support you in 2026 and beyond!

]]>
Revolutionizing Manufacturing IT Delivery:... https://community.connection.com/revolutionizing-manufacturing-it-delivery-how-connection-accelerates-efficiency-and-innovation/ Dec 16, 2025 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2025/12/3392571-GTM-DW-Topic-3-Manufacturing-TIDC-BLOG.jpg

Manufacturing organizations are under more pressure than ever to deliver operational excellence, manage costs, and drive growth—all while navigating a rapidly evolving technology landscape. From supply chain disruptions and workforce shortages to the need for faster, smarter IT deployments, the challenges are real and growing. That’s where Connection’s Technology Integration and Distribution Center (TIDC) steps in as a true partner, empowering manufacturers to modernize, optimize, and accelerate their IT service delivery.

Why Manufacturers Need a New IT Delivery Model

Today’s manufacturing IT teams face a perfect storm of challenges:

  • Limited staffing and skill shortages
  • Long lead times and supply chain disruptions
  • High turnover and distributed workforces
  • The need to support multiple platforms and remote locations

Traditional approaches to IT provisioning—such as piecemeal shipments, on-site staging, and manual configuration—simply can’t keep up with the pace of business. Manufacturers need a partner who can deliver end-to-end solutions that are fast, scalable, and reliable.

The TIDC Advantage

Connection’s TIDC is designed to be that partner. As Steven Crowthers, Vice President, Technology Integration and Distribution Center, explains, “The TIDC offers a full suite of services, from integration and logistics to final mile delivery. Our mission is to enhance the client experience and deliver a value proposition that benefits the entire manufacturing ecosystem.”

What does that look like in practice? Here are just a few ways TIDC is transforming IT delivery for manufacturers:

  • Unbox and Go: Gone are the days of messy, multi-vendor shipments and on-site chaos. With Connection’s TIDC, equipment arrives pre-provisioned, tested, and ready to deploy. Whether it’s for a factory, warehouse, or remote worker, our solution streamlines integration, reduces packaging waste, and accelerates speed to value.
  • Manufacturing in a Box: Every manufacturer is unique. Connection’s TIDC offers customizable solutions for every scenario, from new facility launches to remote onboarding. Highly trained engineers handle everything from design and discovery to packaging and delivery, ensuring a seamless end-user experience.
  • Zero-touch Provisioning: With advanced imaging, tagging, and configuration, Connection’s TIDC enables plug-and-play deployments at scale. Facilities can be stood up in weeks, not months, with GPS-tracked shipments and precise delivery windows that keep projects on track.

Real-world Results

Greg Geeve, Enterprise Account Executive, Connection, describes the impact: “We work hand-in-hand with the TIDC from the earliest stages of a project, even before a building is constructed. Our teams collaborate to ensure every device, from endpoints to network switches, is delivered, configured, and ready to go. The result? We’ve reduced labor by 35% and cut installation times dramatically, creating real cost savings and growth opportunities for our customers.”

This partnership model isn’t about taking control away from IT teams. It’s about empowering them to focus on strategic initiatives while the TIDC handles the heavy lifting. Customers maintain oversight and involvement, but benefit from streamlined processes, reduced risk, and fewer surprises.

Efficiency, Savings, and Strategic Value

Outsourcing IT provisioning to Connection’s TIDC isn’t just about convenience. It’s a smart economic decision. As Steven notes, “We save clients money by reducing vendor management, increasing deployment speed, minimizing production disruptions, and lowering packaging costs. It’s a cost-effective, efficient way to accelerate modernization and automation projects.”

Greg adds, “Our services let customers invest in better technology instead of extra labor and infrastructure. With secure processes and precise documentation, we deliver innovation and real savings that are proven in every quarterly business review.”

Ready to Modernize Your Manufacturing IT Delivery?

The Connection TIDC is more than a service provider. It’s a strategic partner dedicated to helping manufacturers thrive in a complex, fast-moving world. Whether you’re launching a new facility, upgrading your workforce technology, or looking to streamline IT delivery, the TIDC has the expertise, solutions, and commitment to help you succeed. Want to learn more? Connect with your Account Team or engage our Manufacturing Practice to speak with a specialist.

]]>
A New Way to Work: Meet Copilot Chat in... https://community.connection.com/a-new-way-to-work-meet-copilot-chat-in-microsoft-365/ Dec 11, 2025 Christy Burton https://community.connection.com/author/christy-burton/ https://community.connection.com/wp-content/uploads/2025/12/3362871-Microsoft-Copilot-Chat-BLOG.jpg

Every organization is looking for smarter ways to work—how to communicate clearly, reduce manual tasks, and focus on what really matters. Copilot Chat in Microsoft 365 brings that possibility to life. Acting as your built-in AI assistant, Copilot Chat helps you summarize, brainstorm, and execute tasks more efficiently across the apps you already use every day.

Whether you’re drafting an email in Outlook, analyzing numbers in Excel, or recapping a meeting in Teams, Copilot Chat is there to lighten the load. For business leaders and IT decision-makers, it’s a secure, enterprise-ready step into the world of AI-assisted productivity.

What Is Copilot Chat and How Does It Work?

Think of Copilot Chat as a digital coworker that can understand and respond to everyday language. Instead of memorizing commands or writing code, you can simply type or speak a request like, “Summarize this document,” or “Create a report using these notes.”

Behind the scenes, Copilot Chat draws from the information you already have in your organization—emails, documents, chats, and calendars—while respecting the same permissions and security policies your Microsoft 365 environment uses. That means it can only access the files you’re authorized to see.

Copilot Chat doesn’t use your company’s information to train its underlying AI models, and conversations stay within your secure Microsoft 365 tenant. The result is a trusted Microsoft AI chatbot that helps you make sense of your workday without risking privacy or compliance.

What Can an AI Assistant Do in Everyday Work?

Copilot Chat is more than a chatbot; it’s a capabilities-driven AI assistant designed for real-world tasks. With every interaction, you’re freeing up more time for creativity, decision-making, and strategy.

It can help you:

  • Summarize and condense information. Turn long emails, threads, or meeting transcripts into concise takeaways with clear action items.
  • Draft and rewrite content. From emails and proposals to reports and newsletters, Copilot Chat can suggest outlines, adjust tone, or polish your writing.
  • Analyze and visualize data. Ask questions about your spreadsheets and watch it generate tables, charts, and formulas instantly.
  • Create presentations. Transform Word documents or meeting notes into ready-to-edit slides with speaker prompts and summaries.
  • Answer questions. Need to locate a specific file or recall key project details? Just ask. Copilot Chat retrieves answers from content you already have permission to access.

Why Microsoft Copilot Inside Microsoft 365?

Unlike many AI chat tools that require switching between apps or sharing external data, Copilot Chat works where you already work. It’s embedded across Word, Excel, PowerPoint, Teams, and Outlook.

Because it operates inside Microsoft 365, Copilot Chat follows your organization’s security, compliance, and identity controls. Plus, Microsoft offers guided learning paths and weekly challenges that help users gain confidence in prompt writing and AI-assisted collaboration.

Do You Need a License for Copilot Chat?

Yes. Accessing Copilot Chat requires signing in with a work account and having the proper Microsoft 365 Copilot license. Administrators control which users and departments can enable the service, ensuring consistent governance and data management.

For organizations exploring Copilot Chat for the first time, this is an ideal moment to start a pilot program—and Connection can help. New customers may be eligible for 15% off professional services, making it easier to plan, configure, and deploy Copilot Chat securely within your Microsoft 365 environment.

Organizations can find detailed information on access, licensing, and setup in Microsoft’s Learn Copilot Chat resources—an excellent starting point for IT leaders introducing AI assistance company-wide.

How to Log in and Get Started

Getting started is simple:

  1. Sign in to your Microsoft 365 account using your work credentials.
  2. Verify that your organization’s administrator has enabled Microsoft 365 Copilot services.
  3. Open your preferred app—Word, Outlook, Excel, or Teams—and look for the Copilot Chat icon.
  4. Try requesting something like “Summarize this email thread” or “List next steps from our last meeting.”

Within moments, Copilot Chat begins responding in context, helping you explore how AI-assisted productivity fits into your daily routine.

What Are the Capabilities of an AI Assistant Like Copilot Chat?

Copilot Chat’s strength lies in its range of AI-powered capabilities, which work together to simplify tasks and boost focus:

  • Drafting and rewriting: Create polished emails, proposals, and reports in less time.
  • Summarizing and extracting: Highlight the most important points from meetings, chats, or documents.
  • Data analysis: Ask questions about spreadsheets and visualize insights through charts and tables.
  • Presentation creation: Generate PowerPoint slides and speaker notes directly from Word or Teams content.

You can even use Copilot Chat to learn how to use it better. The more you explore, the smarter your prompts become. Ask it for prompt guidance, such as tips on how to phrase questions or requests more effectively, to unlock stronger, more accurate results. Over time, it becomes a learning cycle that helps employees build skill and confidence using AI across their daily workflows.

Is Copilot Chat Like ChatGPT?

Copilot Chat and ChatGPT share similar foundations in large language model technology, but they’re not the same. Copilot Chat is integrated directly into Microsoft 365, meaning it’s governed by enterprise-grade privacy and security policies.

Cornell University, for example, explains that Copilot operates within its Microsoft 365 environment and adheres to the university’s data protection and governance policies, reinforcing Microsoft’s commitment to privacy and compliance while providing a secure space to explore the benefits of AI in everyday work. You can think of it as ChatGPT’s professional counterpart, built specifically for secure business use within Microsoft tools.

Common Questions About Copilot Chat

What is Copilot Chat in Microsoft 365?
It’s an AI-powered assistant built into Microsoft 365 that helps you summarize, draft, and analyze information across your organization’s apps and data.

How does AI-assisted chat improve productivity?
By reducing time spent on repetitive tasks—like summarizing meetings, writing reports, or finding key details—so teams can focus on higher-value work.

Do I need a license for the Copilot app?
Yes. You must sign in with your work account and have the appropriate Microsoft 365 Copilot license enabled by your admin.

Where do I sign in for Microsoft Copilot?
Go to your Microsoft 365 dashboard and sign in with your work credentials. Once enabled, you’ll see the Copilot icon in supported apps.

What can a copilot do with my files?
Copilot Chat only works with the content you have access to and operates within your organization’s security settings.

How good is Copilot for proposals, meetings, and reports?
It’s particularly strong in those areas. You can generate outlines, summaries, and next steps instantly, and save hours of manual editing.

Fun fact: As many organizations test these capabilities, eligible new customers can take advantage of a 15% discount on Copilot licenses with a minimum purchase of 10 seats (max. 2,500). This is a great opportunity to explore Copilot Chat implementation and AI adoption strategies with expert guidance.

Bringing AI Assistance into Daily Work

Copilot Chat brings together the power of AI and the familiarity of the tools employees already use. It helps transform repetitive, time-consuming work into actionable insights and polished results, without leaving the Microsoft ecosystem.

By starting small, like trying a few repeatable tasks and tracking time saved, organizations can quickly measure impact. And with Connection’s expertise in AI enablement, deployment, and security, your team can move forward confidently, knowing that the technology truly works for you.

]]>
Beyond Procurement: A Holistic Approach to... https://community.connection.com/beyond-procurement-a-holistic-approach-to-healthcare-it-lifecycle-management/ Dec 04, 2025 Kelly Kempf https://community.connection.com/author/kelly-kempf/ https://community.connection.com/wp-content/uploads/2025/12/3354721-GTM-DW-Topic-3-Healthcare-IT-Lifecycle-Management-BLOG.jpg

I recently visited one of my favorite big box furniture stores. I walked the vignettes, considering which design I thought might be a good fit. I noted my favorite items and prices and briefly chatted with my partner via text. Then I weighed logistics: project budget, haul or delivery, timing of such, box weights, assembly time, and help required (no, my seven-year-old doesn’t count). I also wondered how to price and sell our old furniture to avoid extra clutter. Needless to say, I left with a future execution plan and several pictures for ideation, but with no furniture in my cart. 

Similarly, in business, we often generate strong ideas but struggle with execution due to overlooked logistics like stock availability, delivery and setup costs, depot services, employee training or additional infrastructure needs. Much like a trip to a furniture store that stalls before it starts, these oversights can and will hinder progress. Although we note mistakes and adjust our actions, in the world of healthcare, delays in technology deployment impact more than just time and budget. When patient outcomes are at risk, every moment and every penny invested matters. 

A Holistic Lifecycle Management Approach

Traditional IT lifecycle management (ITLM) spans from asset procurement to disposal. Most resellers assist only at the start and end of this process. Connection takes a more comprehensive approach by working closely with stakeholders to understand business challenges, ensuring our solutions are properly designed to make the biggest impact on your financial and sustainability goals. Our services cover not only asset acquisition, but also preparation, integration, deployment, ongoing management, and end-of-life strategies.

Overcoming Logistic Nightmares in Supply Chain and IT Lifecycle Management

Over the years, Connection has developed systems designed to remove obstacles in delivering and implementing technology for healthcare. Whether you are looking for a plug-and-play replacement for current workstations on wheels or plan to open several new clinics in the next year, our solutions can make life easier for project management, procurement, IT, and the clinical staff. Imagine receiving 10 new medical carts, fully assembled and shrink-wrapped with all the peripherals in place: monitor, thin client, keyboard, mouse, scanners, printer, camera, and more—mounted, connected, imaged, and tested—so that you can roll them right off the delivery truck for the IT team to approve for deployment into production. Magical, right? But it’s a reality that can be executed. Let’s explore additional ways Connection can optimize your end-user experience with custom configuration services, convenient deployment options, and advanced inventory planning and rollout management—all backed by exceptional customer experience and support.

What Does Sustainability and IT Lifecycle Management Success Look Like?

According to McKinsey, enterprise technology contributes nearly 400 megatons of CO₂ emissions, yet fewer than 30% of organizations implement sustainable IT practices.1 To address this, IT lifecycle management is critical for reducing environmental impact and optimizing costs.2

There are two common use cases for ITLM that can significantly advance sustainability efforts while also simplifying complex projects:

  • Relocations and Consolidations—Accurate asset inventory can help you know where to relocate assets as necessary and plan for device end-of-life, including both disposal and resale options.
  • Large Deployments—Consolidation and engineering of packing and transportation increase efficiencies and reduce shipping costs and waste.

At Connection, our approach includes end-to-end lifecycle services—from procurement and deployment to asset disposition and recycling. We accomplish this through programs like ConnectONE™ for device refresh, OS upgrades, supply chain optimization, and zero-touch provisioning. We measure our success through KPIs such as waste diversion, carbon footprint reduction, refurbishment rates, and cost savings, ensuring compliance and sustainability goals are met.

IT Outsourcing for the Win

A strong IT outsourcing (ITO) relationship drives trust, stability, and innovation beyond what low-cost contracts offer. When done well, ITO can play a big role in transforming your IT architecture and business. Global spending on ITO is expected to reach $2 trillion by 2028, driven by demand for cost efficiencies and digital expertise.3 Forrester, Deloitte, and Gartner all agree that strategic ITO provides access to expertise, improves efficiency, and frees up time for innovation.

78% of businesses report positive experiences with IT outsourcing, citing improved service quality, cost savings, and strategic value.4 According to IDC, managed device services deliver a 65% lower cost of deploying new devices and applications.5 This is the opportunity of ITO. Done correctly, it can drive long-term scalability, resilience, and simplicity in IT operations.

Proof Is in the Numbers

Connection adheres to the motto, “Change Happens. Expertise Wins,” with our metrics and investments reflecting a firm commitment to quality and excellence. Don’t just take our word for it—our numbers show it. 

Net Promoter Score: Connection’s current Net Promoter Score stands at an impressive 79, with exceptional service levels identified as the leading factor in client retention.

Service Delivery:

  • Inventory accuracy is maintained at 99.999% across over $1 billion in receipts.
  • On-time order fulfillment and carrier performance:
    • Priority, advance exchange, next day, scheduled rollout, and e-commerce orders (under 15 units) consistently meet same-day SLA above 98.5%.
    • Carrier on-time delivery rate is 98.5%; next day delivery service achieves 99.2%.
    • Comprehensive last mile and white glove services are available.

Upgrades and Expansions: Our facilities encompass 267,000 square feet, supported by 50Gb fiber, VPN connectivity, over 2,000 live connections, and customizable add-on services.

Security and Compliance: Connection maintains certifications for safety and compliance including SOC2 Type 2, ISO 9001:2015, ISO/IEC 20000-1:2018, ISO 22301:2019, and ISO 27001:2013 standards.

Secure Endpoints: Secure infrastructure components with best-of-breed solutions to ensure HIPAA compliance and protect against targeted cyberattacks.

Dedication to Sustainability: Packaging reduction and recycling initiatives contribute to lower freight costs and diminished fuel consumption, supporting environmentally responsible operations.

Partner with Connection

With over 40 years in the channel and 35 years’ experience in technology integration and distribution, we understand the impacts of supply chain on advancing your technological innovation. Our goal is to provide ongoing value that supports business outcomes, not just meet SLAs. For centralized management, visibility, coordination, scalability, customization, increased optimization and sustainability, and excellent client service, Connection is your ideal partner. For additional information on how you can contain costs, boost efficiency, and unburden your IT teams through optimized supply chain and IT lifecycle management, contact your Account Team or visit our Supply Chain and Lifecycle Webpage to learn more.

References:

  1. McKinsey Digital, 2022, The green IT revolution: A blueprint for CIOs to combat climate change
  2. Gartner, 2024, Gartner Says Most Cost-effective Sustainable IT Initiatives Are Underutilized
  3. Michael O’Grady, “Global IT Services Spend Will Reach $2 Trillion By 2028,” Forrester, May 1, 2024, https://www.forrester.com/blogs/global-it-service-spend-will-reach-2-trillion-by-2028/
  4. FullScale, 2024, Complete Guide to IT Outsourcing: Models, Costs, and Strategy (2025) https://fullscale.io/blog/complete-guide-to-it-outsourcing
  5. IDC, 2024, The Benefits of Integrating Managed Device Services

]]>
Mastering AI with Microsoft 365 Copilot: Key... https://community.connection.com/mastering-ai-with-microsoft-365-copilot-key-insights-and-real-world-applications/ Dec 02, 2025 Christy Burton https://community.connection.com/author/christy-burton/ https://community.connection.com/wp-content/uploads/2025/11/3340721-Blog-Post-Mastering-AI-with-Microsoft-365-Copilot-BLOG.jpg

In our recent exploration of mastering AI using Microsoft 365 Copilot, we uncovered several valuable insights and practical applications that can significantly enhance productivity and efficiency in the workplace.

Advancing with Microsoft 365 Copilot

One of the key takeaways is the seamless integration of Microsoft 365 Copilot into various applications like Teams, Outlook, Word, and Excel. This integration makes Copilot an indispensable tool for daily tasks. As Rob Gates, Principal Cloud Solution Architect at Connection, mentioned, “Copilot has been integrated into the flow of work, providing answers and generating content right where you need it.”

Real-world Use Cases

  1. Content Generation and Research: “Copilot can create blog posts, analyze financial data, and generate presentations based on technical specifications,” said Rob Gates. For example, a marketing team used Copilot to draft a comprehensive blog post about their new product launch. By analyzing the product’s technical specifications, Copilot generated a detailed and engaging article that was ready for publication.
  2. Security and Compliance: A financial services firm leveraged Copilot to handle sensitive client data. The firm was reassured by Copilot’s compliance with data protection policies, which allowed them to use AI tools confidently without compromising security. “Copilot adheres to enterprise data protection policies, ensuring that user data is secure and not used to train the models,” emphasized Woody Walton, Director of Partner Technology at Connection.
  3. Enhanced Collaboration: As Rob Gates explained, “We can elevate this to a shareable component that I can then work with some folks on.” For example, a project management team used Copilot to collaborate on a project plan. By sharing the generated content in a Loop page, team members could easily edit and contribute, streamlining the collaboration process.
  4. Data Analysis and Visualization: An analytics team used Copilot to analyze a large dataset of sales figures. Copilot generated detailed charts and graphs, making it easier for the team to identify trends and make data-driven decisions. “Copilot can create charts and graphs, providing visual insights into data,” as Rob Gates noted.

The Future of AI in the Workplace

The rapid advancements in AI technology are transforming the way we work. Tools like Microsoft 365 Copilot are at the forefront of this transformation, enabling organizations to enhance productivity, ensure data security, and drive innovation. As Woody Walton highlighted, “It’s exciting to see how these tools are being weaved into the flow of work, making AI an integral part of our daily tasks.”

By integrating AI into everyday tasks, organizations can unlock new levels of efficiency and innovation. We look forward to seeing how you will implement these learnings in your work!

]]>
Securing the Future: Key Insights on... https://community.connection.com/securing-the-future-key-insights-on-ais-dual-role-in-cybersecurity/ Nov 25, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/11/3347521-Securing-the-Future-BLOG.jpg

We recently hosted a webinar with Jamal Khan, Chief Growth and Innovation Officer at Connection and Head of the CNXN Helix™ Center for Applied AI and Robotics—and guest speaker Allie Mellen, Principal Analyst at Forrester—to explore one of the most pressing questions in cybersecurity today: How is artificial intelligence reshaping both our defenses and the threats we face?

Both speakers bring deep expertise in the intersection of security operations, AI innovation, and risk management. Jamal leads Connection’s efforts to apply AI and robotics responsibly across industries, while Allie has spent years researching how AI and automation are transforming the Security Operations Center (SOC) and the broader cybersecurity ecosystem.

In recognition of National Cybersecurity Awareness Month, the Microsoft-sponsored webinar cut through the hype to reveal where AI is delivering real value—and where it still demands caution.

Separating Hype from Reality: Understanding AI’s Limitations

One of the first myths our speakers addressed was the idea that AI will replace the Security Operations Center (SOC).

Jamal noted that early predictions about fully automated SOCs “hit their reality factor.” He said that while automation and AI are valuable tools, the notion of replacing the SOC is misguided. “It still requires a significant amount of human involvement,” he explained.

Allie agreed, describing how her perspective has evolved. “To be honest, I think the past three years or so have been kind of a disappointment,” she said. “You can use generative AI for things like researching threat actors—but nothing really compelling. That’s changing now, particularly because of AI agents.”

Still, she cautioned that current capabilities don’t justify removing humans from the loop. “Sometimes AI is wrong, and we need a human to really understand what’s going on and make an informed decision.”

The conversation also touched on another misconception: that AI can effectively train junior analysts. Allie pointed to a recent study showing that “AI chatbots gave a wrong answer to more than 60% of queries.” She argued that using these tools to upskill staff “is a crazy idea… especially for someone deeply concerned with risk and risk management.”

Where AI Actually Delivers Value

Both speakers agreed that while AI won’t replace the SOC, it can make it far more efficient.

Jamal emphasized AI’s potential to “reduce toil.” This refers to the repetitive, manual tasks that weigh down security teams. He described how generative AI can “help us inform and build better decisions” by summarizing logs, generating case notes, and enriching alerts.

Allie added that she’s now seeing “AI agents that are purpose-built to do things like triage or initial investigation of incidents.” Those functions, she explained, “take up so much time in an analyst’s day, and to be seeing AI be used, and used effectively, for those functions… it’s actually really starting to get exciting.”

She categorized current AI applications in security into three areas:

  1. Content creation, such as report writing and script evaluation
  2. Knowledge articulation, including chatbots for research and intelligence queries
  3. Behavior modeling, where AI creates playbooks, generates parsers, and assists with investigation and triage

“This,” she said, “is where the real value add is going to be.”

The Double-edged Sword: AI Empowers Attackers Too

The discussion also explored how AI is changing the threat landscape. Jamal noted that while AI supports defenders, it also gives attackers “scale management” and the ability to create highly personalized phishing, deepfakes, and automated reconnaissance.

Allie explained how AI lowers the technical barriers that once limited attackers. “Being able to operate among different types of infrastructure becomes very important,” she said. “That’s changing significantly because of AI.”

She referenced recent research showing that threat actors are already using AI to support ransomware-as-a-service operations, romance scams, and reconnaissance. “There are a lot of really effective things that attackers can use AI for to aid their efforts,” she said, adding that nation-state attackers will likely adopt agentic AI systems first, with cybercriminals following.

Jamal raised an interesting question about whether both attackers and defenders could eventually “over-trust AI.” If adversaries rely too heavily on automated systems, he suggested, “they become noisy and thereby less effective.”

Evaluating AI Security Products: Beyond the Marketing

When asked how to separate substance from marketing, Allie shared Forrester’s framework for evaluating AI in security products: trust, utility, and cost.

Trust: The biggest red flag is vendors that rely on user thumbs-up/thumbs-down feedback, essentially crowdsourcing quality assurance. The only truly reliable method at scale is expert validation, with dedicated teams evaluating outputs before actions are executed. We're moving from deterministic software to non-deterministic systems, requiring new testing paradigms and continuous validation.

Utility: Is this feature genuinely useful for your team's specific workflows, or is it a checkbox item that sounds impressive in vendor presentations?

Cost: Pricing models for AI features remain wildly inconsistent across vendors. Understanding total cost of ownership—including API calls, compute resources, and data processing—is essential before committing.

Organizations also need metrics to validate AI effectiveness. Are mean time to detect and respond (MTDR) metrics improving? Is alert volume decreasing while detection accuracy increases? These change management metrics provide concrete evidence of value.
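As a concrete picture of those detection-and-response metrics, here is a minimal sketch that computes mean time to detect (MTTD) and mean time to respond (MTTR) from incident timestamps. The records and field names are invented for illustration; in practice these values would come from your SIEM or ticketing system.

```python
from datetime import datetime

# Hypothetical incident records (invented for illustration):
# when each incident occurred, was detected, and was resolved.
incidents = [
    {"occurred": datetime(2025, 11, 1, 8, 0),
     "detected": datetime(2025, 11, 1, 8, 30),
     "resolved": datetime(2025, 11, 1, 10, 0)},
    {"occurred": datetime(2025, 11, 3, 14, 0),
     "detected": datetime(2025, 11, 3, 14, 10),
     "resolved": datetime(2025, 11, 3, 15, 40)},
]

def mean_minutes(deltas):
    """Average a list of timedeltas, expressed in minutes."""
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 60

# MTTD: occurrence -> detection; MTTR: detection -> resolution.
mttd = mean_minutes([i["detected"] - i["occurred"] for i in incidents])
mttr = mean_minutes([i["resolved"] - i["detected"] for i in incidents])
print(f"MTTD: {mttd:.0f} min, MTTR: {mttr:.0f} min")  # MTTD: 20 min, MTTR: 90 min
```

Tracked quarter over quarter alongside alert volume and detection accuracy, averages like these help turn “is the AI helping?” into a measurable question.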

The New Attack Surface: Securing AI Itself

Allie and Jamal also discussed the security challenges introduced by deploying AI.

Allie outlined three main areas of risk: users, applications, and models. Users face issues like prompt injection and data leakage; applications are vulnerable through vector databases and enterprise data; and models are exposed to inference attacks, tampering, and data poisoning.

She introduced Forrester’s new AEGIS framework (Agentic AI Enterprise Guardrails for Information Security), which focuses on:

  • Least agency: extending Zero Trust to limit what AI agents can do
  • Continuous risk management: replacing one-time assessments with ongoing monitoring
  • Explainable outcomes: ensuring teams understand how and why AI systems make decisions
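The “least agency” principle can be pictured as deny-by-default gating of agent tool calls. The sketch below is a minimal illustration with invented agent and tool names; it is not Forrester’s AEGIS implementation or any vendor’s actual control.

```python
# Deny-by-default allowlist of the tools each AI agent may invoke.
# Agent and tool names are invented for illustration.
ALLOWED_TOOLS = {
    "triage-agent": {"read_alert", "enrich_ioc", "write_case_note"},
    "report-agent": {"read_case", "draft_summary"},
}

def invoke_tool(agent: str, tool: str) -> str:
    """Run a tool only if the agent has been explicitly granted it."""
    if tool not in ALLOWED_TOOLS.get(agent, set()):
        # Anything not explicitly granted is refused (least agency).
        raise PermissionError(f"{agent} may not call {tool}")
    return f"{agent} called {tool}"

print(invoke_tool("triage-agent", "enrich_ioc"))  # triage-agent called enrich_ioc
# invoke_tool("report-agent", "enrich_ioc") would raise PermissionError
```

The Zero Trust posture carries over directly: grants are per-agent and as narrow as the workflow allows, and logging every call supports the framework’s explainable-outcomes goal.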

Jamal noted that many existing risk frameworks struggle to keep pace with AI’s rapid evolution. “A lot of these frameworks, in their static nature, are always two steps behind,” he said.

The CISO Action Plan: A Phased Approach

To conclude the webinar, Jamal offered a pragmatic roadmap for CISOs.

First 30 Days: Conduct comprehensive AI inventory (organizations are often surprised by shadow AI already operating), establish clear policy guidelines, and launch high-value, low-risk pilots.

Next 30 Days: Build a model registry tracking approved models and their lineage, create prompt repositories, implement DLP controls on AI prompts (especially for PII), define evaluation processes, and develop incident response runbooks for AI failures.
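One of those controls, DLP on AI prompts containing PII, can be pictured at its simplest as a redaction pass over outbound prompts. The patterns below (a U.S.-style SSN and an email address) are illustrative only; a production control would live in a real DLP engine with far broader coverage.

```python
import re

# Illustrative-only PII patterns; a production DLP engine covers far more.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact_prompt(prompt: str) -> str:
    """Replace recognized PII with placeholder tokens before the prompt leaves the tenant."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[REDACTED-{label}]", prompt)
    return prompt

print(redact_prompt("Summarize the case for jane.doe@example.com, SSN 123-45-6789."))
# Summarize the case for [REDACTED-EMAIL], SSN [REDACTED-SSN].
```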

Following 30 Days: Integrate AI with SOAR strategy, establish meaningful guardrails beyond basic controls, implement red teaming programs, and continuously improve based on real-world usage.

Allie added that CISOs must act as translators for their organizations. “There’s an expectation from other teams that you’re going to see significant value-add from AI very quickly,” she said. “Security has to explain why that’s not always realistic.” She recommended rolling out AI tools first to senior staff “who have the most understanding of your processes,” before extending access to less experienced analysts.

Final Takeaway

This discussion made one thing clear: AI won't eliminate security expertise. Rather, it will make it indispensable. As the race between AI-wielding attackers and defenders intensifies, the path to success is strategic: adopt AI thoughtfully, maintain healthy skepticism, and keep humans firmly in control. This allows defenders to harness AI's power for acceleration, not autonomy.

Watch the full webinar to hear the complete discussion on guardrails, real-world applications, and the future of agentic systems.

As a Microsoft Solutions Partner for Security—with all four advanced security specializations—Connection helps defend identities, data, and infrastructure from evolving cyberthreats while boosting operational efficiency. Learn how our Microsoft security experts work alongside your team to strengthen your security posture. To view our cybersecurity services, visit www.connection.com/cybersecurity.

]]>
TechSperience Episode 142: Revolutionizing... https://community.connection.com/techsperience-episode-142-revolutionizing-healthcare-with-clinical-mobility/ Nov 21, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/11/3331871-TechSperience-Ep142-Revolutionizing-Healthcare-BLOG.jpg

This conversation explores the transformative impact of clinical mobility and technology in healthcare, focusing on how Zebra Technologies enhances clinical workflows, improves patient and provider safety, and addresses security concerns. The discussion highlights innovations in nursing, the integration of AI, and the importance of measuring ROI in healthcare technology.

The speakers share insights on Zebra’s role across the healthcare ecosystem and future innovations, emphasizing the need for effective tools to support clinicians and improve patient outcomes.

Show Notes

00:00 Introduction to Clinical Mobility in Healthcare

03:03 Transforming Clinical Workflows with Technology

05:43 The Role of AI in Healthcare

08:37 Enhancing Patient and Provider Safety

11:22 Addressing Security in Healthcare Technology

13:58 Zebra's Impact Across Healthcare Ecosystem

17:02 Measuring ROI in Healthcare Technology

19:46 Future Innovations and Acquisitions at Zebra

22:39 Final Thoughts and Closing Remarks

]]>
Partnering GPO with Connection: Why... https://community.connection.com/partnering-gpo-with-connection-why-organizations-should-consider-a-new-procurement-strategy/ Nov 18, 2025 Janet Watts https://community.connection.com/author/janet-watts/ https://community.connection.com/wp-content/uploads/2025/11/3334721-Why-Organizations-Should-Consider-a-New-Procurement-Strategy-BLOG.jpg

In today’s competitive business landscape, healthcare organizations are constantly seeking ways to optimize procurement, reduce costs, and enhance operational efficiency. The most popular strategy today is purchasing through a Group Purchasing Organization (GPO) in partnership with Connection. This powerful combination offers more than just savings—it delivers strategic value.

What Is a GPO?

A GPO aggregates the purchasing power of multiple businesses to negotiate better pricing, terms, and services with suppliers. By joining a GPO, members gain access to pre-negotiated contracts that would be difficult to secure independently. GPOs empower members to consolidate purchasing advantages across a diverse range of supplier categories.

The Value of Connection

Connection is a National Solutions Provider (NSP) that goes beyond simply selling laptops, keyboards, and software. We provide advisory, managed, and professional services—such as consulting, implementation, customization, training, and support—tailoring solutions to meet specific business needs. If your organization is tired of tech headaches—or just wants a smoother way to manage IT needs—Connection is here to help you calm the confusion of IT.

Wondering if partnering a GPO with Connection makes sense? Here are a few things to think about:

Cost Savings without Compromise

GPOs leverage collective buying power to secure competitive pricing. When partnered with Connection, members gain access to certified technical experts and experienced Account Managers who specialize in negotiating with OEMs for exclusive discounts, ensuring members’ investments align with strategic goals and deliver maximum ROI.

Making Procurement Easier with Connection

Tech procurement can get messy. Between negotiating RFPs, juggling vendors, and figuring out product specs, it’s overwhelming. This is where a GPO–Connection partnership can make life easier. We cut through the complexity by streamlining the entire process. With standardized contracts and terms already handled by the GPO, members can skip the back-and-forth and get straight to buying—faster and with less hassle. It’s smart and efficient procurement that works.

Expert Guidance the Connection Way

GPOs are great at simplifying where and how to buy, but what about figuring out what to buy and why it matters? That’s where Connection steps in. Think of us as a strategic tech advisor. We help organizations make smart, informed decisions based on their unique goals—whether it’s upgrading IT infrastructure, choosing the right medical equipment, or finding the best-fit software. Our goal is to make sure every purchase supports the bigger picture, not just the member’s budget.

Procurement Doesn’t End with the Purchase Order

Connection provides end-to-end support, from installation and configuration to training and troubleshooting. This ensures that solutions are not only acquired efficiently but also deployed effectively, minimizing disruption so member investments perform optimally. GPOs reduce procurement risks; Connection adds another layer of assurance with expert implementation and post-sale support, minimizing downtime and errors. With Connection at the helm alongside a partnered GPO, members can focus on strategic initiatives rather than administrative tasks. This leads to faster decision-making, improved productivity, and better resource allocation.

Real-world Impact

Needs evolve as organizations grow. GPOs offer scalable contracts, while Connection provides flexible solutions that adapt to a member’s changing requirements—whether they're expanding operations or upgrading technology. Imagine a mid-sized healthcare provider needing new IT infrastructure. By purchasing through a GPO, members have access to discounted pricing on hardware and software. Connection then customizes the setup, integrates it with existing systems, and trains staff. This delivers a seamless, cost-effective solution.

Ready to Implement a New Procurement Strategy?

The synergy between GPOs and Connection creates a procurement model that is cost-effective, efficient, and strategic. It’s not just about buying smarter; it’s about building partnerships and relationships that drive long-term success. Let’s chat about how partnering your GPO with Connection can relieve your tech headaches and make IT procurement a lot easier.

]]>
Federal Funding for the Public Sector: How... https://community.connection.com/federal-funding-for-the-public-sector-how-to-secure-the-resources-you-need-in-uncertain-times/ Nov 07, 2025 Pam Aulakh https://community.connection.com/author/pam-aulakh/ https://community.connection.com/wp-content/uploads/2025/10/3297171-PSG-SLED-Federal-Grants-Blog.jpg

Imagine this: your institution is preparing to roll out a major technology initiative. It’s fully scoped, community impact is clear, and the team is aligned. Then, without warning, the federal funding you were relying on is delayed or withdrawn. Suddenly, you’re left with hard choices: scale back, postpone, or pause entirely.

If that scenario sounds familiar, you’re not alone. In fact, nearly 65% of college and university CBOs say federal policy uncertainty is impairing even basic financial planning, and at the K–12 level, over $6 billion of federal funds remain unreleased to state systems. Across higher education, K–12, and state and local government, uncertainty around federal funding has become the norm, not the exception. The question many public sector leaders are asking is no longer, “Can we get funding?” but, “How do we keep projects moving when traditional funding sources are in flux?”

The good news: there is a way forward. And it starts with a broader view of what funding really looks like today.

The Reality of Today’s Federal Funding Landscape

Federal funding cycles have never been simple. Shifts in political priorities, economic uncertainty, and evolving national challenges all play a role in how funds are distributed—and when. For colleges, universities, school districts, and municipal governments, this volatility can have a direct and immediate impact.

In higher education especially, the conversation around financial sustainability has intensified. Universities are finding themselves reviewing long-term project roadmaps, searching for more diversified funding models, and rethinking their reliance on traditional federal allocations.

But even as these concerns grow, many public sector organizations overlook a powerful tool already available to them: grants. Beyond the headline-grabbing federal programs, there’s a wide network of state-level grants, private foundation support, corporate initiatives, and hybrid partnerships that continue to fund meaningful public projects every day.

The problem isn’t that funding doesn’t exist. It’s figuring out which opportunities fit your needs, assessing their potential, and moving quickly enough to secure them.

Expanding Your Funding Strategy Beyond Federal Dollars

If your team is still focused only on the next round of federal dollars, you could be missing out on flexible, accessible alternatives. Grants can fund nearly every type of initiative, especially in technology, infrastructure, digital equity, and educational innovation.

At Connection, we help public sector organizations make sense of the broader grant ecosystem by breaking opportunities into three key categories:

  • Tech-focused grants are built for digital transformation. Whether it's upgrading IT infrastructure, improving cybersecurity, or bringing new classroom technologies to life, these grants are aligned with technical outcomes and measurable impact.
  • Flexible funding opportunities may not be written explicitly for tech, but they allow technology to play a role in broader goals. From community development to operational modernization, these funds offer versatility if your project includes digital components.
  • Solution-open grants prioritize results, not methods. They’re not inherently tech-focused, but they’re open to innovative solutions—especially when those solutions support measurable public benefit.

Knowing how to categorize, compare, and prioritize grants makes it easier to align them with your organization’s goals and act quickly when opportunities arise.

The Power of Proactive Grant Intelligence

One of the biggest barriers to securing grant funding isn’t eligibility—it’s actually timing. Many organizations learn about a promising grant opportunity too late to mount a competitive proposal. Others spend time applying for funding that’s poorly aligned with their project goals, leading to frustration and missed chances.

That’s where intelligence comes in.

Connection offers real-time visibility into available and upcoming grants, including how they’re scored, what they fund, and how closely they align with your needs. Our experts don’t just share lists; we also help you understand which opportunities are a good strategic fit, where you can be competitive, and how to start preparing even before RFPs are publicly released. This kind of proactive insight can make the difference between a stalled project and a funded one.

Speaking the Language of Funding

If you’ve ever had a grant proposal rejected, you know that eligibility isn’t enough. Every funder, from federal agencies to private foundations, has their own set of priorities, language, and evaluation criteria. A proposal that hits the mark for one opportunity might completely miss the point for another.

Understanding the “language of funding” is essential. For example, a school district applying for broadband expansion support may find relevant grants from the Department of Education, a state IT initiative, or a corporate digital equity program. But each of those opportunities will ask different questions, require different metrics, and prioritize different community outcomes.

The key is learning how to speak directly to the funder’s goals and framing your proposal not only as a project, but as a partnership that advances their mission as well as your own.

Connection supports organizations in refining messaging, aligning proposals to evaluation criteria, and increasing their chance of approval across all types of funders.

Building a Sustainable Public Sector Funding Strategy

The organizations that secure consistent funding don’t wait for a crisis to explore their options. They build ongoing grant intelligence into their planning cycles. They stay informed, evaluate opportunities before they become urgent, and make funding awareness part of project development from the start.

This approach isn’t just about keeping up; it’s about creating funding resilience.

Imagine being able to identify viable grant options at the concept stage of your next initiative. Instead of scrambling for dollars later, you can shape your strategy with funding requirements in mind from day one. That’s the shift organizations are making, and it’s one we help our customers build into their operational mindset.

Taking the Guesswork Out of Grants with Connection

At Connection, we believe securing grant funding doesn’t have to be daunting. Our public sector customers don’t just get access to lists of available grants—they also get guidance. We walk you through the process, from evaluating your current initiatives to aligning them with specific opportunities and preparing for timely action.

That includes:

  • Reviewing your short- and long-term goals
  • Identifying available and emerging grant opportunities that align
  • Explaining how those grants are scored and prioritized
  • Offering actionable next steps to improve your funding readiness

While Connection doesn’t author or submit grant proposals, our role is to help you approach the process with confidence. We equip your team with the insights, context, and strategy to identify the right opportunities and pursue them effectively.

This isn’t just about checking boxes. It’s about creating a clear, customized strategy to help you fund what matters most, without the delays and guesswork that hold so many projects back.

Funding Clarity Starts Here

The first step? Start the conversation. Find out what’s available. Understand how your goals align with current opportunities. And build a plan that turns uncertainty into action.

Connection works with public sector organizations every day to help them navigate the grant landscape. Our tools and expertise can bring clarity to your funding strategy and help you act before it’s too late. Contact your Connection Public Sector representative to explore grant opportunities that align with your priorities and put your mission back in motion. Don’t wait for the next budget cycle. Let’s secure the funding you need, together.

]]>
From Burnout to Breakthrough: How Clinical... https://community.connection.com/from-burnout-to-breakthrough-how-clinical-mobility-solves-healthcares-staffing-crisis/ Nov 03, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/10/3213021-Zebra-Healthcare-BLOG.jpg

Healthcare workers enter their profession to provide patient care, but they spend an increasing portion of their shifts wrestling with disjointed technology. Clinicians currently juggle pagers, smartphones, badge scanners, and tablets. Each device serves a purpose, yet collectively they create a burden that fragments workflow.

Despite billions invested in healthcare technology, many healthcare workers feel more overwhelmed than empowered. Instead of easing workloads, fragmented systems often add to the burden, compounding an already critical workforce crisis. With staffing shortages deepening and experienced clinicians leaving at alarming rates, healthcare organizations can no longer afford tools that create complexity. The real imperative is to invest in platforms that ease burnout and amplify clinician effectiveness.

The answer lies in integrated clinical mobility platforms that reduce device overload, streamline communication, and keep clinicians connected without keeping them tethered. When technology works seamlessly together, existing teams can work smarter and faster, leading to greater job satisfaction and more time for direct patient care.

The Hidden Cost of Device Overload and Burnout

Spend a few minutes shadowing a healthcare worker and the issue becomes obvious: pockets stuffed with devices—a smartphone, pager, badge scanner, keys—often paired with a heavy workstation on wheels or a tablet. This device overload fuels burnout by creating cognitive strain: constantly checking multiple tools, remembering different functions, and physically juggling equipment while trying to deliver care.

When a clinician’s hands are full of devices, they can’t easily assist a patient with mobility, open doors, or respond naturally in urgent situations. It’s more than an inconvenience; it’s demoralizing for professionals who entered healthcare to heal, not to troubleshoot technology.

Communication gaps only deepen the frustration. Fragmented workflows mean precious minutes slip away hunting for information or colleagues: a physician pages a nurse who must call back; a clinical question requires a pharmacy consult, triggering another trip and another call. Each interruption compounds, draining energy and pulling focus away from what matters most: the patient.

Zebra’s Integrated Platform: Technology That Works Together

Zebra’s approach fundamentally differs from the patchwork of point solutions many hospitals have accumulated. Instead of adding another device, its Clinical Mobility as a Platform approach creates an integrated ecosystem where communication and documentation work seamlessly together.

The platform combines two key components:

  1. Zebra’s Hands-free Clinical Communications Solution: A wearable, voice-activated device worn like a badge. It enables push-to-talk communication and instant team connectivity without occupying the clinician’s hands, maintaining the mobility healthcare requires.
  2. Zebra Mobile Computing and Tablet Solutions: These hospital-grade devices provide a unified platform for documentation, medication administration, and patient data access. They are disinfectant-ready, drop-resistant, and designed to integrate with existing electronic health records for real-time decision support.

The power is in the integration. By operating on a unified platform, clinicians experience seamless transitions between receiving information, documenting care, and coordinating with the team. Features like single sign-on and unified interfaces mean less time managing technology and more time managing care. Hot-swappable batteries support full-shift operation and uninterrupted device availability, eliminating downtime and ensuring smooth handoffs between shifts. This is critical when nearly 80% of serious medical errors stem from communication failures during shift change.

Hands-free Means More Hands for Patients

The hands-free solution is a fundamental shift in how clinicians stay connected. Instead of reaching for a phone or returning to a station, nurses can communicate instantly while remaining at the bedside.

Consider the real-world impact: A nurse administering medication notices a change in a patient’s vital signs. She can initiate a voice-activated call immediately, describing the situation while maintaining eyes-up contact with the patient. What might have been a 10-minute interruption becomes a 60-second intervention.

During a rapid response, the device allows the care team to coordinate without losing focus on the patient. Even in everyday scenarios, a nurse can instantly connect with pharmacy to answer a patient’s question, allowing the conversation to flow naturally without frustrating delays. This instantaneous, hands-free connection translates directly into faster response times and better patient engagement, which results in higher patient satisfaction.

Mobile Computing that Moves with Care

While hands-free communication solves the connectivity challenge, mobile computing addresses the documentation burden that keeps clinicians away from patients.

  • Bedside Documentation: Clinicians can document care in real-time, which improves accuracy, reduces errors, and eliminates the cognitive strain of remembering details to chart later.
  • Safer Medication Administration: Barcode scanning at the point of care electronically verifies the five rights of medication administration (right patient, drug, dose, route, time), and has been shown to significantly reduce medication administration errors and improve patient safety.
  • Point-of-Care Access: Lab results, imaging, and care plans are available when and where clinicians need them, eliminating trips back to the nursing station or logins into multiple systems.

These devices are built for the harsh realities of healthcare, withstanding repeated disinfection and constant use, leading to less downtime and more reliable tools. Hot-swappable batteries allow for full-shift operation without interruption, keeping devices powered through demanding schedules and ensuring seamless handoffs during shift changes.

The Force Multiplier Effect

Reducing workflow friction creates compounding value. If a nurse saves a conservative five minutes per patient interaction through faster communication and bedside documentation, that translates into an hour or more per shift returned to direct care. Multiply that across a hospital, and the capacity gains are substantial.
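As a rough sketch, the capacity math behind that claim can be worked through directly. The interaction counts and staff size below are illustrative assumptions for a hypothetical mid-sized hospital, not figures published by Zebra or Connection:

```python
# Illustrative capacity math; every input here is an assumed value,
# not a published benchmark.
minutes_saved_per_interaction = 5   # conservative savings per patient interaction
interactions_per_shift = 12         # assumed bedside interactions per nurse, per shift
nurses_per_day = 200                # assumed nurses on duty on a given day

minutes_returned_per_nurse = minutes_saved_per_interaction * interactions_per_shift
hours_returned_hospital_wide = nurses_per_day * minutes_returned_per_nurse / 60

print(minutes_returned_per_nurse)    # 60 -- an hour per nurse, per shift
print(hours_returned_hospital_wide)  # 200.0 -- direct-care hours regained each day
```

Even with deliberately modest inputs, the hospital-wide figure dwarfs the per-interaction savings, which is the force-multiplier point.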

The impact extends beyond time savings:

  • Reduced Burnout: Less cognitive burden means more mental energy for clinical decision-making. Decision fatigue and error rates decrease.
  • Improved Collaboration: Care teams stay synchronized with faster escalation when needed and smoother handoffs between shifts.
  • Better Patient Outcomes: Faster response times and more comprehensive monitoring lead to earlier interventions, and patients report higher satisfaction when clinician communication is strong.

Technology as a Retention Tool

The staffing crisis is fundamentally a retention crisis. Exit interviews consistently point to frustration with inefficient workflows, feeling unable to provide quality care, and technology that hinders rather than helps. Each of these factors is addressable through better clinical mobility.

The average cost of turnover for one staff RN is over $56,000, making the investment in technology that improves retention not just justifiable but imperative.

Zebra’s platform directly addresses burnout by creating sustainable workflows. When clinicians are equipped with tools that actually work and make their jobs easier, job satisfaction improves. Efficiency gains reduce the need for overtime and extended shifts. By removing the daily friction that erodes professional fulfillment, organizations create environments that attract and retain staff.

The Path Forward: Sustainable Healthcare Through Better Technology

The healthcare staffing crisis shows no signs of quick resolution, and burnout continues to drive experienced healthcare workers away from a profession they once loved. Healthcare organizations must help existing teams work more effectively while creating environments that attract and retain talent—not through surface-level perks, but through fundamental improvements in how work gets done. Technology alone isn’t the answer. But when implemented thoughtfully, the right technology becomes the foundation for breakthrough. Zebra’s Clinical Mobility Platform represents this kind of integrated approach, addressing not just isolated pain points, but the systemic friction that drains clinicians’ energy, accelerates burnout, and erodes patient care.

Ready to explore how clinical mobility solutions can solve your healthcare IT challenges? The healthcare experts at Connection are here to help! Contact your account executive or engage our Healthcare Practice today to start planning your journey toward enhanced patient outcomes and operational efficiency.

]]>
TechSperience Episode 141: Navigating the... https://community.connection.com/techsperience-episode-141-navigating-the-intersection-of-data-and-ai-building-a-robust-ai-strategy/ Oct 31, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/10/3249721-TechSperience-Ep141-Intersection-Data-AI-BLOG.jpg

In this conversation, the speakers discuss the intersection of data strategy and AI strategy, emphasizing the importance of collaboration between different roles within the organization. They outline their 12-month plans for AI implementation, focusing on governance, risk management, and the challenges of data governance. The discussion also touches on the ownership of AI production systems and the principles that guide decision-making in the face of conflict. The speakers highlight the need for a balance between innovation and risk management, as well as the importance of maintaining strong partnerships within the organization.

Speakers:

Jamal Khan - Chief Growth and Innovation Officer, Connection

Scott Sova - Senior Vice President and CIO, Connection

Kipp Alvarez - Principal Data Analyst, Connection

Show Notes:

00:00 Introduction to Data and AI Strategy

08:52 12-Month Plan for AI Implementation

17:45 Balancing Innovation and Risk Management

24:50 Data Governance and Security Challenges

27:21 Managing Shadow AI and Encouraging Innovation

28:46 Vision and Innovation: The Importance of Rechecking Paths

30:34 Navigating Customer Relationships and Risk Management

31:40 Understanding NPU Technology and Its Implications

34:09 Balancing AI Workloads: Cloud vs. Local Processing

35:31 Contractual Considerations in AI Partnerships

38:48 Ownership of AI Production Systems: A Collaborative Approach

43:29 Establishing Governance and Support Structures for AI

46:04 Defining Tiebreaker Principles in AI Decision-Making

50:07 Operationalizing Governance and Risk Management

51:55 Commitments to Innovation and Collaboration

]]>
LIDER ERG Delivers Over 1,000 Meals to... https://community.connection.com/lider-erg-delivers-over-1000-meals-to-seniors-a-hispanic-heritage-month-celebration-rooted-in-service/ Oct 27, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/10/3285521-ConnectionCares-LIDER-ERG-Volunteer-BLOG.jpg

Connection’s LIDER ERG marked Hispanic Heritage Month with a high-impact service event, packing over 1,000 meals for seniors at Feeding Palm Beach in Boynton Beach, Florida.

This wasn’t just a volunteer moment—it was a strategic act of service that aligned with our ERG’s mission and Connection Cares values. The event demonstrated how ERGs can drive impact, build community, and generate ROI, all while honoring heritage and leadership.

The volunteer team included Grisell Martinez, Carmen Ormaeche, Elizabeth Soto, Sally Navarro, Verónica Castro, Jesal Bhatt, Silvia Restrepo, and Jennifer Freites. 

The LIDER ERG continues to show how cultural celebration and community service can be financially viable, socially meaningful, and strategically aligned with company goals.

]]>
TechSperience Episode 140: Signals of... https://community.connection.com/episode-140-signals-of-post-quantum-insecurity/ Oct 21, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/10/3249721-TechSperience-Ep140-Signals-of-Post-Quantum-Insecurity.BLOG_.jpg

In this episode, the Connection Security Center of Excellence Team takes us on a journey—both literal and technical—sharing insights gained from a recent reconnaissance road trip across the United States. Their firsthand experiences offer a front-line perspective on the evolving landscape of post-quantum insecurity. Listeners will hear from our security experts as they unpack the evolving risk landscape that we may be challenged with in a post-quantum world—supported by real world field observations. Join us for a timely and thought-provoking discussion that blends technical insight with real-world reconnaissance that will help us all navigate the post-quantum future.

For more information on how to better secure your environment, visit http://connection.com/cybersecurity. Or, if you’re ready to start the conversation around what Connection can do to help your organization, call 1.800.998.0067.

Speakers:

John Chirillo, Principal Security Architect, Connection

Rob Di Girolamo, Senior Security Architect, Connection  

Kimberlee Coombes, Security Solution Architect, Connection

Show Notes

00:00 Introduction to the Quantum Road Trip

02:50 Objectives and Planning of the Research Trip

05:34 Understanding Post-quantum Security

08:09 Current Preparedness for Quantum Threats

10:45 Real-world Observations and Insights

13:28 Future Predictions and Data Analysis

]]>
2025 Nashville Healthcare Tech Summit:... https://community.connection.com/2025-nashville-healthcare-tech-summit-transforming-innovation-into-action/ Oct 21, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/10/3259871-Nashville-Tech-Summit-Recap-BLOG.jpg

From the first moments at the Gaylord Opryland in Nashville, it was clear: this year’s Healthcare Tech Summit would not be just another conference—it was a pivotal moment for bringing innovation, partnership, and real-world impact to the healthcare IT community. As someone who’s spent nearly two decades in healthcare strategy, I am deeply proud of how Connection and our partners moved the conversation forward on issues that matter most—security, patient/provider experience, cost containment, and the promise (and reality) of AI in health.

A Community United by Vision and Urgency

The Summit brought together more than 100 client attendees, representing 77 unique healthcare organizations across 33 states, and nearly 100 partner guests across 35 leading vendors. The diversity of the attendee base—from hospitals and LTACs to behavioral health, managed services, payor organizations, and manufacturing—mirrored the evolving, interconnected nature of healthcare itself.

What struck me most was the energy in the room: professionals confronting financial pressures, regulatory complexity, talent shortages, and the daily realities of keeping patients safe and data secure—but doing it with purpose, transparency, and hope.

Client Perspectives: The Real-world Priorities

Our pre-event survey highlighted what’s keeping technology, operations, and security leaders up at night: cybersecurity threats, AI regulation, interoperability, cloud vulnerabilities, and system outages topped the list.

When asked about their top IT initiatives, respondents prioritized EMR upgrades, application consolidation, identity and access management, AI in all its forms (agentic, vision, generative), and augmenting staff with managed service solutions. The breadth of projects on the horizon underscored a crucial insight: innovation must be practical, scalable, and always grounded in business and patient impact.

A Program Designed for Collaboration and Action

The opening moments set an inclusive, problem-solving tone. Instead of a passive keynote, attendees dove into a hands-on collaborative challenge session where client, Connection, and partner teams tackled real healthcare IT scenarios and presented actionable solutions. As one peer reflected, “We leave with not just ideas, but next steps.”

Breakout session highlights included:

  • Lenovo: Innovative strategies to elevate the patient experience
  • Chrome OS: Raising standards for access and efficiency
  • Zebra: Enhancing provider workflows and outcomes with modern technology
  • Microsoft: Untangling complexity and driving impact in multi-cloud/hybrid environments
  • Dell: Securing healthcare’s digital future amidst new and evolving threats

I was heartened to see our partners—Zebra, Chrome OS, Microsoft, Dell, Lenovo, and many others—show up not simply as sponsors, but as thought leaders and co-creators. Connection’s subject matter experts helped frame each session in the context of our clients’ toughest challenges—bridging the vital gap between technology possibility and healthcare reality.

Networking with Purpose—and Results

Strategic networking was woven into the fabric of the Summit: structured client-partner “speed dates,” an expo featuring 23 technology booths, and evening sessions focused on peer-to-peer learning and direct connections. A standout moment for me was the customer testimonial captured with St. John’s Health, which reinforced how Connection’s partnership delivers not just value—but transformative results for patient care and operational excellence.

Outcomes and Next Steps

Feedback was clear: attendees valued the Summit’s focus on actionable solutions, authentic dialogue, and takeaways that will shape their digital roadmaps. Partner demand exceeded expectations—so much so that both the expo and main sessions were waitlisted within days of the first invitation. The creative, relentless outreach of our marketing and events teams paid off in a program that was not only inspiring, but inclusive and relevant.

Yet, we also learned valuable lessons for improving the nomination process, making room for even more emergent tech partners, refining the communication cadence with both clients and partners, and simplifying logistics to make it even easier for healthcare leaders to engage.

Connecting to a Broader Mission

Perhaps most importantly, the Summit underscored that Connection’s role is more than technology matchmaking—we are value creators, educators, and long-term partners. As the stories and priorities that emerged in Nashville proved, Connection’s commitment to healthcare is grounded in “learning the language of healthcare,” growing and supporting our GPO business, and working with clients to solve their most pressing problems—together.

What Comes Next

To every client, partner, and team member, thank you for making the 2025 Nashville Healthcare Tech Summit a beacon for what’s possible when community, innovation, and patient outcomes are at the center. We’re already incorporating your insights as we shape future events and solutions. Planning for 2026 is underway!

If you are looking to turn bold ideas into actionable outcomes, let’s start that conversation now. What should Connection focus on next to help your organization thrive in this changing healthcare landscape? Contact your Account Manager to share your ideas.

Learn more about our Healthcare Solutions and Services.

]]>
Living on the Edge: Securing Retail IT... https://community.connection.com/living-on-the-edge-securing-retail-it-infrastructure-at-scale/ Oct 07, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/10/3203793-MIM-EdgeMgmt-Retail-BLOG.jpg

Retailers are under enormous pressure to deliver faster, more seamless customer experiences. From frictionless checkout and real-time inventory visibility to digital signage and personalized promotions, edge computing is at the heart of modern retail operations. But while edge devices unlock innovation, they also introduce new risks. Every POS terminal, kiosk, and smart sensor deployed on the sales floor becomes another endpoint that needs protection. Without proactive safeguards, these assets can quickly become vulnerabilities.

The Growth and Risk of Retail Edge

By 2025, more than 75% of enterprise data will be processed outside traditional data centers and clouds, up from less than 10% in 2019.

Global investment in edge infrastructure is also expected to reach $350 billion by 2027.

For retail, this means explosive growth in connected devices across stores: self-checkouts, mobile scanners, in-store cameras, and IoT sensors that track everything from customer traffic to refrigeration. But these devices often operate in unattended, decentralized environments, making them harder to monitor, patch, and secure.

Why Retail Edge Is So Vulnerable

Unlike centralized IT systems, retail edge devices are widely distributed across hundreds or thousands of stores. They are physically exposed, with kiosks, scanners, and signage accessible to customers. They are also inconsistent in management, often running on outdated firmware or legacy POS systems.

Traditional perimeter defenses can’t keep pace. A single unpatched POS terminal or unsecured kiosk can open the door to significant breaches.

Building Trust by Design

To secure the edge, retailers need to embed trust into the infrastructure itself. This means rethinking how devices are provisioned, monitored, and protected, including:

  • Hardware Root of Trust to ensure only verified firmware and software run on edge devices.
  • Zero-Touch Provisioning (ZTP) to securely enroll devices at scale using encrypted onboarding methods like FIDO Device Onboarding.
  • Out-of-Band Management to keep recovery and diagnostics available even when devices lose connectivity.
  • Microsegmentation and TLS Encryption to safeguard communication between edge devices and backend systems.

These safeguards move retailers from a reactive posture, responding to breaches after they happen, to a proactive one, where risk is minimized at every layer.
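As a rough illustration of the TLS safeguard above, here is a minimal Python sketch of the kind of hardened client context an edge agent (say, a kiosk's reporting daemon) might use when calling backend systems. This is a sketch under stated assumptions, not a reference deployment; the certificate paths in the comment are placeholders.

```python
import ssl

def build_edge_tls_context() -> ssl.SSLContext:
    """Build a hardened TLS client context for an edge device agent."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    ctx.check_hostname = True                     # verify the backend's identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # reject unauthenticated servers
    # For mutual TLS, the device would also present its own certificate,
    # e.g.: ctx.load_cert_chain("/etc/edge/device.crt", "/etc/edge/device.key")
    return ctx

ctx = build_edge_tls_context()
print(ctx.minimum_version == ssl.TLSVersion.TLSv1_2)  # True
```

Pairing a context like this with per-device certificates issued during zero-touch provisioning is what ties the encryption layer back to the hardware root of trust.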

AI as a Forcing Function

Many retailers are already deploying AI at the edge, whether through loss prevention analytics scanning video feeds in real time or predictive inventory systems that track purchasing trends. But these projects often expose gaps: unpatched handheld scanners, under-monitored cameras, or outdated kiosks.

In this way, AI becomes a catalyst for better edge security. By surfacing weak points in infrastructure, it forces IT leaders to modernize how devices are managed and secured.

The Business Case for Securing the Edge

Investing in edge security isn’t just about reducing risk; it also directly improves store performance and customer trust through:

  • Higher system availability that keeps checkouts running and signage active.
  • Stronger customer confidence that comes from protecting payment, loyalty, and behavioral data (KPMG Consumer Loss Barometer).
  • Operational efficiency that grows when IT teams automate patching and streamline management across hundreds of locations.

Why Connection?

At Connection, we bring decades of retail IT expertise and strong partnerships with leading technology providers like Dell, Cisco, and NVIDIA. We’ve helped national retailers secure POS systems, IoT networks, and large-scale edge deployments. Our frameworks for automation and trust-by-design give IT teams the tools they need to scale securely across the store footprint.

Final Takeaway

Edge computing is powering the next generation of retail innovation, but innovation without security is a risk no retailer can afford. By embedding trust into every device and making edge security a measurable priority, retailers can transform their infrastructure into a platform for customer experience, efficiency, and growth.

To ensure transparency, please note that artificial intelligence and large language models may be utilized to enhance the content of this article. This approach helps refine and enrich the information presented, ensuring accuracy and depth.
]]>
Unlocking Productivity with Power Automate:... https://community.connection.com/unlocking-productivity-with-power-automate-simplifying-workflows-and-automation/ Oct 02, 2025 Jonathan Moye https://community.connection.com/author/jonathan-moye/ https://community.connection.com/wp-content/uploads/2025/09/3203757-Power-Automate-Blog-Post.jpg

Do you always do the same things at work every day? You’re not alone. In fact, the McKinsey Global Institute (2023) estimates that 20–30% of activities in knowledge-based occupations (managers, professionals, analysts, etc.) are repetitive and could be automated with current technologies. One platform built for exactly that is Microsoft’s Power Automate.

What Is Power Automate?

Microsoft Power Automate is a dynamic tool designed to streamline your workflow by automating repetitive tasks and business processes in a low-code environment. With Power Automate, you can easily connect your favorite apps and services, set up intelligent workflows, and watch routine actions like approvals, notifications, and data collection happen automatically. Whether you're freeing up time from tedious data entry or integrating complex multi-step business processes, Power Automate brings powerful automation to everyone, letting you focus on the tasks that provide the greatest organizational impact.

Wait. Are Automation and AI the Same Thing?

Power Automate and AI agents like Copilot serve up productivity in totally different ways. It helps to think of Power Automate as your backstage crew, quietly orchestrating the routine, rule-driven stuff behind the scenes. It’s perfect for building automated workflows that handle repetitive tasks, saving you from manual drudgery. In contrast, Copilot is your quick-thinking digital best friend, powered by generative AI. It understands context, gets your sense of humor, responds to natural language, and jumps in with creative suggestions, summaries, and real-time answers. Power Automate automates the “if this, then that” work, while Copilot interprets intent and bends to meet you in the moment.

How Can I Use Power Automate to Streamline My Work?

Let’s dive into how you can use the Power Platform to make your workday a bit easier.

Ever find yourself downloading the same report every single day? This can be automated with Power Automate! Below, I will walk you through the steps to get that report automated. Let’s give you back that time you desperately deserve.

First up, we would look at the website you are downloading your reports from. After that, we would check which API connectors the website offers. Then we would set up a custom connector with a basic “call,” sometimes referred to as an API “request.”

An Application Programming Interface, also known as an API, allows systems to communicate with each other. A great analogy for how an API works is to imagine yourself at a restaurant: as the customer, you have a request (your food order). When you sit down, the waiter gives you a menu—this is the API documentation or interface. It lists exactly what you can order and what you can retrieve with your request. The waiter is the API itself: they take your order to the kitchen. The kitchen is the database or server where the information is held. When your meal (the API response) is ready, the waiter brings it back to you. In both the restaurant and the API exchange, the waiter is the essential go-between that delivers your request and returns the result.
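In code, placing that “order” is just composing an HTTP request. Here is a minimal Python sketch of what the request might look like; the endpoint URL and the bearer-token value are hypothetical placeholders, since your real vendor’s API documentation (the “menu”) defines the actual URL, parameters, and authentication scheme.

```python
from urllib.request import Request

# Hypothetical reporting endpoint -- substitute your vendor's documented URL.
req = Request(
    "https://api.example.com/v1/reports/daily?format=xlsx",
    headers={"Authorization": "Bearer <your-api-token>"},  # placeholder credential
    method="GET",
)
# Inspect what would be sent (nothing is transmitted until the request is opened).
print(req.get_method(), req.full_url)
```

Power Automate wraps this same pattern for you: a custom connector stores the base URL and authentication once, and each flow step fills in the specific “order.”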

Once we’ve got that down, we can have the report downloaded to a shared location, such as your OneDrive or SharePoint. Lastly, we would set up automation to send an email to the required recipients. 

You can schedule report delivery in any cadence. For example, you can schedule the trigger for the report to run Monday–Friday, excluding holidays, at 7:00 a.m. Eastern Time. Then, grab your cup of coffee and read the report that automation brought you. You just operationalized an aspect of your job, returning time to your schedule.
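Power Automate expresses that cadence with a recurrence trigger plus trigger conditions. As a rough, language-agnostic illustration of the same “should this run today?” rule, here is a short Python sketch; the holiday dates are made-up examples, and in practice the list would come from your organization’s calendar.

```python
from datetime import date

# Example holiday list -- placeholder dates, not a real company calendar.
HOLIDAYS = {date(2025, 12, 25), date(2026, 1, 1)}

def should_run(today: date) -> bool:
    """Run Monday-Friday (weekday() 0-4), skipping listed holidays."""
    return today.weekday() < 5 and today not in HOLIDAYS

print(should_run(date(2025, 12, 25)))  # a Thursday, but a holiday -> False
print(should_run(date(2025, 12, 26)))  # a regular Friday -> True
print(should_run(date(2025, 12, 27)))  # a Saturday -> False
```

The 7:00 a.m. Eastern start time would be the recurrence trigger’s own schedule; this condition simply gates whether the rest of the flow executes.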

Some websites may not offer API support, but there are other approaches you can take to automate your work. Some users deploy robotic process automation (RPA) for desktop automation. Although this isn’t as efficient as API-based, scheduled automation, it can serve as a bridge to older systems that lack an API interface.

If you would like to learn more about leveraging the Power Platform to automate work for individuals or across your enterprise, contact your Connection Account Team.

]]>
Azure VMware Solution: Hybrid Cloud for... https://community.connection.com/azure-vmware-solution-hybrid-cloud-for-vmware-workloads/ Sep 30, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/09/3199673-Azure-VMware-Migrate-Blog.jpg

Organizations running VMware workloads are under increasing pressure to modernize without disrupting daily operations. Azure VMware Solution (AVS) provides a fully managed, Azure-hosted private cloud that simplifies this transition. Built on VMware’s familiar software-defined data center stack (vSphere, vSAN, and NSX), AVS supports lift-and-shift migrations, scales as business needs evolve, and enables direct access to Azure’s security, AI, and analytics services.

What Is Azure VMware Solution?

VMware is a leading virtualization platform that enables organizations to run multiple applications and operating systems on the same physical hardware through technologies like vSphere, vSAN, and NSX. AVS makes this familiar VMware environment available on dedicated physical servers hosted in Microsoft Azure, so organizations can seamlessly migrate and manage their workloads while maintaining operational consistency. Available for both commercial and government deployments, AVS starts with a minimum three-host cluster, providing a flexible foundation for hybrid cloud strategies.

Infrastructure and Architecture

Azure VMware Solution is built on dedicated physical servers in Azure, designed to deliver enterprise-grade performance and integration. Key elements include:

  • Hyperconverged infrastructure combining compute, memory, and storage on dedicated hosts.
  • Flexible host SKUs to match different performance and capacity requirements.
  • Private connectivity through ExpressRoute Global Reach for secure, low-latency links to Azure Virtual Networks and services.

Operational Consistency

For IT teams, operational consistency is a key advantage. AVS allows continued use of VMware tools like vSphere, vSAN, NSX, and HCX, while also integrating into the Azure portal. With Azure Resource Manager, Monitor, and Security Center, administrators gain unified management and visibility across VMware and Azure environments.

Core Benefits

Accelerated Innovation with AI and Analytics

Running VMware on Azure means more than just extending infrastructure. AVS unlocks native Azure services—from AI and machine learning to advanced analytics, DevOps pipelines, backup, and monitoring—helping organizations accelerate innovation with their existing workloads.

Enhanced Scalability and Flexibility

AVS clusters scale on demand to meet new workload requirements. Azure’s global data center footprint ensures workloads run close to users, reducing latency and supporting growth in new regions without the need for new on-premises infrastructure.

Security and Compliance

Built on Azure’s trusted cloud foundation, AVS inherits enterprise-grade security and compliance controls. These include encryption at rest, integrated threat protection, and certifications covering GDPR, HIPAA, ISO 27001, and more. For government customers, specialized AVS instances deliver heightened compliance and regulatory alignment.

Simplified Migration Path

AVS supports true lift-and-shift migrations without replatforming. With VMware HCX, organizations can extend L2 networks, use vMotion for live migrations, and maintain existing IP and MAC addresses. This reduces migration risk and preserves current VMware investments.

Cost Efficiency and Predictability

Organizations can optimize budgets with Azure Hybrid Benefit for Windows and SQL Server licenses, free extended security updates, and reserved instance pricing. The operational cost model of AVS helps stabilize expenses while providing predictable growth capacity.

Use Cases and Scenarios

Azure VMware Solution enables a wide range of business scenarios, including:

  • Data center consolidation or extension with minimal downtime.
  • Disaster recovery and business continuity, using AVS as a secondary site.
  • Government-compliant hybrid expansion, meeting strict regulatory requirements.

These use cases highlight AVS’s role in supporting modernization strategies without sacrificing stability or compliance.

Limitations and Considerations

Azure VMware Solution is designed for enterprise-grade performance, but there are a few factors to keep in mind. Each deployment begins with a minimum three-host cluster, and supported host SKUs must be used. To get the best value, clusters should be right-sized to workload needs, as smaller environments may not fully benefit from the platform. Thoughtful planning around network integration and workload placement helps ensure a smooth and cost-efficient experience.

Moving Forward with Azure VMware Solution

Azure VMware Solution provides a streamlined path to hybrid cloud adoption, allowing organizations to preserve their VMware investments while taking advantage of Azure innovation. Its strengths in scalability, compliance, security, and simplified migration help IT leaders modernize without unnecessary disruption. AVS serves as a flexible bridge between today’s VMware workloads and tomorrow’s cloud-driven opportunities.

Ready to evaluate your hybrid cloud strategy and explore how AVS can support your business? The experts at Connection are here to help! Contact your Connection Account Team today and start planning your VMware-to-Azure journey.

]]>
Navigating the IT Supply Chain: Insights,... https://community.connection.com/navigating-the-it-supply-chain-insights-strategies-and-tech-trends/ Sep 25, 2025 Ayanna Campbell https://community.connection.com/author/ayanna-campbell/ https://community.connection.com/wp-content/uploads/2025/09/3191721-GlobalServe-BLOG.jpg

It’s no secret that organizations are facing tighter timelines, budget pressures, and evolving priorities when it comes to the supply chain. GlobalServe is committed to helping you make confident, informed decisions—whether you’re optimizing spend, managing fulfillment, or preparing for 2026.

Staying Ahead of IT Supply Chain Challenges

Tariffs and Trade Turbulence

Tariff policies have remained a hot-button issue throughout 2025, especially in the IT sector. From Q1 to Q2, companies have had to respond to a wave of policy shifts, often with little notice. While most organizations haven’t overhauled their global supply chains, many are making strategic adjustments to stay agile. The prevailing approach is cautious—monitoring developments closely and preparing to pivot quickly.

To stay ahead, companies should consider:

  • Diversifying supply chains to reduce dependency on any single region
  • Utilizing in-country supply and warehousing to mitigate cross-border risks
  • Implementing robust risk management strategies and sustainability initiatives
  • Planning strategically for stockpiling and alternative sourcing

EEMEA (Eastern Europe, Middle East, and Africa)

  • Red Sea disruptions continue to reshape shipping, adding 10–14 days to transit times, spiking freight costs and insurance premiums, and delaying deliveries across EMEA trade lanes.1
  • Iraq’s “Development Road” opens a new high-speed TIR corridor, reducing Poland–UAE transport from ~21 to ~10 days and offering a faster alternative to sea freight through Suez or around Africa.2
  • Eastern Europe transit corridor expansion: major rail projects will cut transit times from East Asia to Europe by ~14 days and expand rail-based freight alternatives to sea routes.3

APAC (Asia-Pacific)

  • JPY’s 15% volatility forces 32% of Tokyo-based electronics firms to relocate assembly to Vietnam/Thailand.4
  • Philippine Peso stability (+2% vs USD).5
  • Regional challenges like Taiwan’s water shortages forcing 5nm chip plants to relocate.6

LATAM (Latin America)

  • Infrastructure and Logistics Gaps: Ports in the Caribbean face severe inefficiencies, including handling costs that are 2–3x higher, low maritime connectivity, and congested infrastructure in smaller ports.7
  • LATAM scores consistently low on Logistics Performance due to inadequate transport and warehousing networks.8
  • Inconsistent trade regulations, limited infrastructure funding, and governance concerns undermine supply chain efficiency.9

North America and Western Europe

  • The Euro-Dollar Rollercoaster: EUR volatility (+8% swings vs USD in 2025) disrupts IT equipment imports, forcing 42% of European tech firms to renegotiate contracts.10
  • Data Center Boom: Hyperscalers accelerate EU builds to hedge FX risks.11
  • Chip Pricing Chaos: Memory module costs fluctuate ±15% monthly, driving firms to stockpile.12
  • New Tech Corridors: Italy’s Chip Packaging Hub: €3B investment lures 15 Asian suppliers to Milan.13
  • Spain’s Talent Pipeline: Barcelona coding schools now feed 40% of EU cloud startups.14
  • Nordics Green Tech Powerhouse: sustainable IT at scale.15
  • Microsoft’s Arctic DC: 100% hydro-powered, slashing cloud carbon costs by 60%.16
  • Northvolt’s Battery Breakthrough: Cobalt-free cells for data center UPS systems.17

Key Challenges in the IT Supply Chain

Across all regions, companies are grappling with:

  • Rising material costs due to imposed tariffs
  • Political instability and the threat of trade wars
  • Supply-demand imbalances and freight fluctuations
  • Limited resources and infrastructure inefficiencies
  • Currency volatility disrupting contracts and pricing

In EEMEA, Red Sea disruptions have added 10–14 days to transit times, spiking freight costs and insurance premiums.1 Iraq’s “Development Road” offers a promising alternative, cutting transport time from Poland to UAE from ~21 to ~10 days.2 In LATAM, logistics performance remains low due to inadequate transport and warehousing networks. Governance concerns and limited infrastructure funding further complicate operations.3

In APAC, regional challenges like currency swings and water shortages are forcing companies to relocate and rethink their strategies.4 In North America and Western Europe, Euro-Dollar volatility is disrupting IT imports, prompting European tech firms to renegotiate contracts. Hyperscale data center builds are accelerating to hedge against FX risks, while chip pricing chaos—±15% monthly fluctuations—is driving stockpiling.5

Budget Maximization Strategy

GlobalServe works closely with finance and procurement teams to help align budgets without compromising future plans. Here’s how we support smarter spending:

  • Review open POs to ensure on-time fulfillment and avoid last-minute rushes
  • Consolidate smaller orders to reduce freight and administrative costs
  • Identify critical hardware or services that should be purchased early to avoid delays
  • Reallocate underused budgets across regions to prevent overspending in others

Tech Updates

GlobalServe’s integration with ServiceNow and OneSource is designed to streamline procurement and reduce errors. Key features include:

  • Auto-creation of purchase requests that sync directly into OneSource
  • Real-time tracking of procurement status without switching platforms
  • Error reduction through integrated fields and pre-filled data

As more organizations transition to remote work to reduce operational costs, GlobalServe offers comprehensive support to make the process seamless, including:

  • End-to-end logistics for delivering devices to new hires and retrieving equipment from departing employees
  • On-site technical support for remote employees during setup
  • Certified data wiping and hardware destruction for environmentally responsible asset disposal

Let GlobalServe be Your Trusted Partner in Managing IT Assets

Ready to future-proof your IT supply chain to maximize your budget? Engage our GlobalServe Team today to schedule a strategy session with our procurement experts or call 1.888.800.0416 to speak with a trusted solutions advisor to get started.

Sources:

1https://atlasinstitute.org/the-red-sea-shipping-crisis-2024-2025-houthi-attacks-and-global-trade-disruption/

2https://trans.info/en/tir-corridor-via-iraq-413011

3https://www.searates.com/blog/post/eurasian-railway-corridor-overview-of-2024-trends-in-rail-freight-from-china-to-europe

4https://www.nri.com/-/media/Corporate/en/Files/PDF/knowledge/publication/lakyara/2024/02/lakyaravol381.pdf?la=en&hash=896CFAFFA916004111B4AB54F1B991BF50CA00A6

5https://manilastandard.net/business/314635318/ph-stocks-rebound-2-peso-weakens-on-rate-cut-hopes.html

6https://www.taiwannews.com.tw/news/6043167

7https://unctad.org/news/latin-america-caribbean-shipping-struggles-amid-geopolitical-and-climate-crises

8https://blog.solistica.com/en/logistics-in-latin-america-where-are-we-and-where-are-we-going

9https://www.imf.org/en/Blogs/Articles/2023/11/16/how-latin-america-can-use-trade-to-boost-growth

10https://mebfaber.com/wp-content/uploads/2025/08/how-will-european-companies-ever-catch-their-u.s.-peers-deutsche-bank.pdf

11https://dynamicsintl.com/investing-in-eu-data-centers-opportunities-and-risks/

12https://www.trendforce.com/presscenter/news/20250417-12551.html

13https://www.fdiintelligence.com/content/ce05004a-6e45-5201-8317-ccc52c402349

14https://www.barcelonacodeschool.com/

15https://www.forbes.com/sites/sap/2024/04/02/the-nordics-next-frontier-powering-up-green-scaleups-for-hypergrowth/

16https://datacenters.microsoft.com/sustainability/efficiency/

17https://www.carscoops.com/2023/11/northvolt-develops-cobalt-free-battery-cell-that-could-unlock-cheaper-cleaner-evs/

]]>
Harnessing Change: How Connection Is... https://community.connection.com/harnessing-change-how-connection-is-powering-the-future-of-higher-education/ Sep 24, 2025 Bobby Sears https://community.connection.com/author/bobby-sears/ https://community.connection.com/wp-content/uploads/2025/09/3201550-AI-HigherEd-BLOG.jpg

Higher education is at a crossroads. Colleges and universities that were once able to maneuver with careful deliberation now face an accelerating wave of disruption driven by digital transformation, shifting student expectations, and new economic realities. The traditional model of waiting for challenges to arise and responding cautiously is no longer sustainable. To thrive in this evolving landscape, higher education institutions must move from reactive technology approaches to proactive, forward-looking strategies that strengthen resilience and unlock new opportunities.

Technological innovation creates new obstacles—but also new paths to progress. Leaders must confront pressing challenges that go beyond just implementing new tools, such as:

  • Enhancing cybersecurity and risk governance to protect against increasingly sophisticated threats.
  • Improving digital equity and inclusion to ensure students of every background can fully participate in a tech-enabled learning environment.
  • Balancing generative AI innovation with safeguards that protect the integrity and value of education.
  • Strategizing resource allocation and empowering the workforce.

By addressing these interconnected challenges with a proactive lens, higher education institutions can harness change as a strategic advantage for long-term success.

Successful Change Starts with People, Not Technology

Empowering people in how they use technology is the key to sustainable change. For higher education institutions to adapt and thrive in the AI era, the focus must be on building AI fluency and institutional readiness. Progress for higher education starts at the individual level.

As Pat Yongpradit, Chief Academic Officer of Code.org and Lead of TeachAI, noted:

“Teachers are saying, ‘I need training, it needs to be high quality, relevant, and job-embedded…’ In reality, people require guidance, and that means teachers and administrators going through professional development.”

This need for guidance is clear across higher education. Educators and education leaders alike acknowledge that AI is paramount to professional development. Nearly half (47%) of education leaders cited AI upskilling for employees as the top workforce strategy for the next 12 to 18 months, according to Microsoft’s AI in Education Report.

For educators, the stakes are even higher. Their professional development in AI has a compounding effect, shaping not only their own readiness but also the preparedness of the students they teach.

The workforce students will enter is already defined by AI-driven expectations. A strong majority (76%) of global leaders now view AI literacy as a core component of basic education. This means students must develop the ability to manage AI assistants, critically evaluate outputs, and delegate tasks effectively, rather than relying passively on automated tools and habituating dependencies.

But nurturing the requisite skills in students requires educators to possess them first. Therein lies the gap—nearly 45% of global educators report they have received no AI training at all. Without targeted professional learning programs, colleges and universities risk falling behind in preparing both their staff and their students for an AI-powered future.

Bridging the Gap in Higher Education with Connection

To bridge this divide, institutions must invest in high-quality, relevant, and ongoing professional development that equips their people, not just their platforms, to thrive in a world where AI is central to both education and work.

Higher education leaders need a trusted technology partner to help them manage change for long-term success. At Connection, we are committed to empowering institutions through innovation and collaboration, with a people-first mindset.

Higher Education Technology Solutions and Services

As an accredited Microsoft in Education Global Training Partner with 25 years of serving the public sector, Connection provides higher education organizations with specialized technology solutions and services.

Every institution faces the same waves of disruption, but each navigates them from a different starting point—with its own culture, priorities, and constraints. That’s why a one-size-fits-all approach to technology won’t work. Higher education leaders need strategic solutions tailored to their unique mission, and they need a trusted partner who can align technology with people, process, and purpose to turn vision into reality.

Keep Up with the Latest in Higher Education at EDUCAUSE 2025

Now more than ever, higher education leaders must look ahead for what’s to come. No matter how far off new innovations seem now, the luxuries of today soon become the necessities of tomorrow.

But it’s hard to preemptively prioritize your attention and resources with so many unknowns. Leaders need a space to listen and learn from peers, which is why events like the EDUCAUSE Annual Conference are critical in times of disruption.

EDUCAUSE connects the brightest minds in higher education technology to discover creative solutions for today’s challenges and plan for tomorrow’s possibilities. Now is the time to seek out opportunities for collaboration, networking, and thought leadership.

Find us at EDUCAUSE, where our higher education experts will be showcasing solutions around:

  • Generative AI and academic integrity
  • Cybersecurity and risk governance
  • Digital equity and inclusive access
  • Strategic budgeting and the changing IT workforce

If there’s one thing education leaders and professionals appreciate most, it’s that learning happens best through connection. EDUCAUSE 2025 offers higher education leaders the chance to step away from day-to-day pressures and immerse themselves in forward-thinking conversations that shape the future. By engaging with peers, exploring practical solutions, and sharing your own experiences, you’ll leave better equipped to turn disruption into opportunity.

Harness Change with an Expert Guide

Whether you’re navigating AI’s impact on academic integrity, grappling with cybersecurity threats, striving to expand digital equity, or devising new budget and workforce strategies, EDUCAUSE is the place to find inspiration and be proactive for what’s to come. Join us there, and let’s reimagine together what higher education can achieve in the years ahead. To learn more about Connection’s higher education solutions, connect with a Connection Higher Education Technology Expert and schedule a consultation.

]]>
Upcoming Changes to Exchange Hybrid... https://community.connection.com/upcoming-changes-to-exchange-hybrid-functionality-what-you-need-to-know-and-how-connection-can-help/ Sep 12, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/09/3192471-Microsoft-Exchange-BLOG.jpg

Microsoft is implementing critical updates to Exchange hybrid environments as part of its Secure Future Initiative (SFI), aimed at enhancing security and modernizing hybrid configurations. These changes will directly impact organizations using Exchange Server in hybrid mode with Exchange Online, especially those relying on rich coexistence features like Free/Busy lookups, MailTips, and profile picture sharing.

Key Changes

1. Transition to a Dedicated Exchange Hybrid Application

Historically, Exchange hybrid environments have used a shared service principal (Office 365 Exchange Online) to enable hybrid features. Starting with the April 2025 Hotfix Update (HU), Microsoft is transitioning to a dedicated Exchange hybrid application in Entra ID. By October 2025, all hybrid environments requiring rich coexistence must switch to this dedicated app, as the shared service principal will no longer be supported.

Organizations can make this change by:

  • Installing the April 2025 HU (or later) and running the ConfigureExchangeHybridApplication.ps1 script.
  • Alternatively, using the updated Hybrid Configuration Wizard (HCW), though the script is recommended for robustness.

2. Deprecation of Exchange Web Services (EWS)

Microsoft is also retiring EWS calls from Exchange Server to Exchange Online. These will be replaced with REST-based Microsoft Graph API calls, offering more granular permissions and improved security. This update is expected in Q3 2025, and organizations must adopt it by October 2026 to maintain hybrid functionality.

Who Needs to Act and When

If your organization uses rich coexistence features, you must:

  • Switch to the dedicated hybrid app by October 2025
  • Update to Graph API permissions by October 2026

Failure to do so will result in broken hybrid features, including Free/Busy sharing and MailTips.

Organizations not using rich coexistence should still consider running the Service Principal Clean-Up Mode script to remove legacy certificates and harden their hybrid configuration.

How Can Connection Help

Navigating these changes can be complex, but Connection is here to support you. Our experts can:

  • Assess your current Exchange hybrid setup
  • Guide you through the transition to the dedicated hybrid app
  • Prepare your environment for Graph API integration
  • Ensure compliance and continuity of service

Don’t wait until functionality breaks—contact your Connection team today to schedule a review and stay ahead of these critical updates.

]]>
Microsoft 365 E5: Maximizing Productivity... https://community.connection.com/microsoft-365-e5-maximizing-productivity-and-security/ Sep 04, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/08/3171021-MS365-E5-BLOG.jpg

To thrive in the modern digital workplace, organizations and their employees need more than just the basic tools to get their work done efficiently; they need comprehensive, resilient solutions that can help drive productivity.

Microsoft 365 E5 is an advanced enterprise solution designed to elevate workplace productivity while embedding robust cybersecurity features. With tools for collaboration, automation, compliance, and threat detection, Microsoft 365 E5 enables organizations to streamline workflows and protect against evolving digital risks. As cloud adoption grows, E5 stands out by offering scalable, secure cloud integration that supports modern business operations.

What Is Microsoft 365 E5?

Microsoft 365 E5 is an enterprise license that includes all Microsoft Office 365 features, plus additional security, compliance, and device management capabilities. It’s an ideal solution for enterprise users who want to take advantage of the productivity and security benefits included in the license across apps and services such as:

  • Word
  • Excel
  • Teams
  • PowerPoint
  • Outlook
  • OneNote
  • SharePoint
  • OneDrive
  • Power BI Pro
  • Defender for Endpoint
  • Microsoft Purview

Microsoft 365 E3 is another enterprise solution with similar capabilities, but without the advanced security and compliance features and Power BI business analytics that come with Microsoft 365 E5. While Microsoft 365 E3 contains essential productivity tools and features, Microsoft 365 E5 comprises the full range of compliance and productivity tools to help organizations achieve efficiency more quickly and securely.

Key Features of Microsoft 365 E5

  • Microsoft Defender XDR helps organizations protect against advanced security attacks, like phishing and malware.
  • Microsoft Purview Insider Risk Management helps users identify and investigate insider risks and quickly act.
  • Power BI Pro helps deliver insights to enable fast decisions based on business analytics.
  • Microsoft Entra ID P2 provides advanced identity and access management solutions for hybrid and multicloud environments through Microsoft Entra ID.
  • Advanced eDiscovery helps streamline legal investigations and data compliance.

These features work closely together to create a secure, robust environment that organizations can rely on to help them achieve advanced compliance and productivity.

Enhancing Productivity with Microsoft 365 E5

Microsoft 365 E5 is built to elevate productivity through integrated collaboration and communication tools tailored to the modern workplace. Key productivity tools that come with the enterprise license include:

  • Microsoft Teams for seamless team interaction
  • SharePoint for streamlined content sharing and management
  • Microsoft Loop for real-time, collaborative workflows

These tools work together to support hybrid and remote work, ensuring all team members can collaborate closely, no matter their location or time zone. By unifying messaging, file sharing, and task coordination in a single platform, E5 keeps teams synchronized, responsive, and productive, wherever they are.

Automating Workflows

With Power Automate, users can build custom workflows that eliminate repetitive tasks. Combined with Teams integrations and AI-powered suggestions, E5 enables employees to focus on high-value work instead of manual processes.

Data Analytics for Decision Making

Power BI Pro transforms raw data into actionable insights. Whether it’s sales trends, customer behavior, or operational metrics, leaders can make informed decisions backed by real-time analytics.

Strengthening Cybersecurity and Compliance

Security is a critical component of successful organizational productivity. With Microsoft 365 E5, it isn’t just an add-on; it’s baked in. The E5 suite offers a layered defense strategy that includes identity protection, endpoint security, and enhanced compliance tools. From phishing attacks to insider risks, E5 is designed to detect, respond, and recover quickly.

This proactive approach helps organizations stay ahead of threats and meet regulatory requirements with confidence.

Advanced Threat Protection

With an E5 license, users also gain advanced threat protection. Microsoft Defender for Endpoint and Defender for Office 365 provide real-time protection against malware, ransomware, and phishing. These tools use AI and behavioral analytics to detect threats before they cause damage, freeing IT and security teams to spend their time and resources on other tasks.

Compliance and Governance

Within E5, Microsoft Purview offers tools like Insider Risk Management and eDiscovery to help organizations manage sensitive data and meet compliance standards. It’s especially valuable for industries like healthcare, finance, and government where data governance is non-negotiable.

Seamless Cloud Integration

Microsoft 365 E5 is also built for the cloud-first world. Its architecture is designed to integrate with hybrid and remote-first infrastructures to support everything from mobile workforces to global operations.

So, whether you're migrating from on-premises or scaling cloud-native, E5 adapts to your environment.

Secure Data Mobility

With a Zero Trust architecture, E5 ensures that data is protected wherever it travels. “Remote first” organizations can be confident that they can collaborate securely, wherever they work. Secure file sharing, conditional access, multifactor authentication, and encryption policies keep sensitive information safe across devices and networks.

Scalability and Centralized Management

The Microsoft 365 Admin Center gives IT teams centralized control over users, policies, and licenses. This is especially helpful for organizations with hybrid setups, as E5 helps users access cloud services with their existing credentials through tools like Microsoft Entra ID, which supports directory synchronization. As your organization grows, E5 scales with you, so you won’t have to manage multiple vendors or patch together solutions.

Is Microsoft 365 E5 Right for Your Business?

Microsoft 365 E5 delivers a powerful combination of productivity and protection. E5 is ideal for mid-to-large enterprises that operate in regulated industries, manage sensitive data, or require advanced collaboration tools. If your organization prioritizes security, compliance, and productivity, E5 offers a compelling, all-in-one solution. Connection can provide more information on your options through their tailored security, digital workspace, and multicloud solutions.

If your business is ready to streamline operations, strengthen security, and empower your workforce, E5 might just be the upgrade you’ve been waiting for. In a world where agility and resilience are everything, an integrated, cloud-ready platform like E5 is a game-changer.

The Next Steps

Connection offers expert guidance to help you unlock the full value of Microsoft 365 E5, from advanced security to productivity tools. Our team supports you through seamless implementation and ensures your environment is optimized for performance and compliance. Already using Microsoft 365? We’ll review your current setup to identify opportunities for improvement and cost savings. By consolidating tools and licenses, E5 can help reduce expenses while enhancing your IT strategy. Reach out to your Connection Account Team today to get started with a personalized consultation.

]]>
Edge Evolution Powers the Modern Factory https://community.connection.com/edge-evolution-powers-the-modern-factory/ Sep 02, 2025 James Rust https://community.connection.com/author/james-rust/ https://community.connection.com/wp-content/uploads/2025/08/3168808-Modern-Factory-Edge-Management-BLOG.jpg

Edge computing has come a long way since the days of content delivery networks. Originally, it was used to cache static Internet content like pictures and videos closer to users to compensate for slow network speeds. Once social media and online gaming entered the mainstream, Internet usage became more about interactivity than retrieving unchanging data, and the need for faster response times quickly grew. Local servers now had to not only deliver data but also process data that was continuously generated, and the version of edge computing we know today was born.

That was nothing compared to the explosive growth in edge computing we have seen in the past decade. In a world increasingly reliant on up-to-date data and continuous monitoring of business processes, vast amounts of data must be processed. Cloud computing, which once made things so convenient for businesses, suddenly became impractical in terms of data and energy usage. As operations have grown more complex, edge computing has evolved to match. Manufacturers face the complicated task of deploying the right mix of machines, from far edge devices that aggregate data directly on the factory floor to edge servers in a local data center that run real-time analytics. Procedures change quickly, and moving data processing to the edge brings unique challenges that must be addressed.

Deployment Doesn’t Have to Be Difficult

Let’s say you’ve made the decision to move your data to the edge on the factory floor or in your server room. You’ve invested in the necessary hardware like edge gateways, robust industrial PCs, and powerful edge servers. The issue is how to effectively deploy so many devices when your IT team is already swamped with existing duties and infrastructure management. The solution is to have your new equipment ready to go out of the box.

Zero-Touch Provisioning (ZTP) makes your edge devices truly plug-and-play. Before a device even reaches your site, it’s pre-registered with a central server that holds its exact configuration and necessary software image. Once a ZTP-enabled device powers on, it automatically finds a DHCP or DNS server and downloads the script it needs to configure itself to your precise specifications. This powerful automation not only saves you immense amounts of time but also virtually eliminates the errors that can creep in with manual installations.
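
The ZTP flow described above can be sketched as a simplified simulation. Everything here (the registry contents, server name, and serial numbers) is hypothetical; a real deployment would discover the server via a DHCP option or DNS record and pull configurations from a vendor provisioning service.

```python
# Simplified simulation of the Zero-Touch Provisioning flow: a device boots,
# discovers its provisioning server, and pulls the configuration that was
# pre-registered for its serial number. All names and records are invented.

# Stand-in for the central ZTP server's registry of pre-staged configs.
ZTP_REGISTRY = {
    "SN-1001": {"image": "edge-gw-2.4.1", "role": "gateway", "vlan": 110},
    "SN-1002": {"image": "ipc-5.0.3", "role": "line-monitor", "vlan": 120},
}

def discover_server():
    """In practice: read DHCP option 66/67 or resolve a well-known DNS name."""
    return "ztp.example.local"

def provision(serial):
    """Fetch and 'apply' the config pre-registered for this device."""
    server = discover_server()
    config = ZTP_REGISTRY.get(serial)
    if config is None:
        raise LookupError(f"{serial} is not pre-registered with {server}")
    return {"serial": serial, "server": server, **config}

result = provision("SN-1001")
print(result)  # the device now matches its pre-staged specification
```

Because the configuration comes from the central registry rather than manual entry, two devices with the same profile cannot drift apart at install time, which is where most manual-installation errors creep in.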

Maintenance Is a Breeze with the Right Tools

Once everything’s up and running, you might quickly realize you’re managing a vast number of devices spread across the factory floor or suddenly have far more equipment packed into your on-site data center. The question then becomes: how do you effectively maintain all this infrastructure?

Once again, ZTP and our comprehensive platforms handle this for you. Operating from a centralized system, they push firmware updates and system changes to all relevant devices simultaneously, eliminating the need for individual updates. Software patches and updates are delivered over the air, while devices continuously send logs and performance metrics to a central monitoring system. Any anomalies are either corrected automatically or immediately trigger alerts for your IT personnel, ensuring issues are known and addressed the moment they occur. This prevents configuration drift and keeps your schedule clear for more strategic tasks.

If devices do reach end of life or fail beyond repair, ZTP frameworks simplify their removal. They securely de-provision the device, ensuring it’s completely wiped of all data. Any replacement devices can simply be swapped in and loaded with the last known good image, so the new device boots up and takes its place with no further issues.

Keeping Your Investment Safe

Even with your system optimized and updated, there is still one important question. How do you keep your system secure from threats? Depending on how it’s implemented, edge computing can expand the attack surface and weaken your overall cybersecurity posture.

The best way to keep your new devices secure is by utilizing a Zero Trust Architecture. Your system will operate on the principle of “never trust, always verify,” meticulously double-checking every person and device attempting access. It’s also crucial to put a network segmentation plan in place. It’s reasonable to assume that there will eventually be a breach, so dividing the edge network into isolated segments based on function and sensitivity will help prevent bad actors from penetrating deep into your infrastructure.
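
A toy sketch of how “never trust, always verify” combines with segmentation: every request is evaluated against the target segment’s policy on both user role and device trust, regardless of where it originates. The segment names, roles, and trust thresholds below are invented for illustration.

```python
# Toy zero-trust access decision: no request is waved through because it
# originates "inside" the network; each one is checked against the policy
# of the segment it targets. Segments and thresholds are hypothetical.

SEGMENT_POLICY = {
    "plc-control": {"min_trust": 3, "allowed_roles": {"controls-engineer"}},
    "telemetry":   {"min_trust": 2, "allowed_roles": {"controls-engineer", "analyst"}},
    "guest":       {"min_trust": 1, "allowed_roles": {"controls-engineer", "analyst", "visitor"}},
}

def authorize(role, device_trust, segment):
    """Allow access only if both the user's role and the device's trust
    level satisfy the target segment's policy; unknown segments deny."""
    policy = SEGMENT_POLICY.get(segment)
    if policy is None:
        return False  # default deny
    return role in policy["allowed_roles"] and device_trust >= policy["min_trust"]

assert authorize("controls-engineer", 3, "plc-control")        # fully verified
assert not authorize("analyst", 3, "plc-control")              # role not allowed
assert not authorize("controls-engineer", 2, "plc-control")    # device trust too low
```

Even if an attacker compromises a guest-segment device, the same check that admitted it there blocks lateral movement into the control segment.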

When deploying hardware directly onto the manufacturing floor, the environment is a critical consideration. Specialized enclosures and ruggedized devices can easily withstand the rigors of a machine shop, and even more advanced protection exists for truly extreme working conditions. What’s most important is to match the right protection to the right setting. Even beyond environmental resilience, remember that devices on the floor are physically accessible to workers. It’s vital to implement a robust access control system and enforce proper use of credentials by authorized personnel in order to keep everything running properly.

Edge Technology Can Be Managed

Harnessing the power of edge computing offers significant benefits, but it also introduces considerable challenges. Choosing which devices and architectural designs are best for your business can be overwhelming given today’s IT staffing limitations. Pre-configuration can be complex, especially when your facility needs a large number of diverse machine profiles. Connectivity on the factory floor often poses an issue, potentially requiring substantial network infrastructure upgrades. Even with advanced solutions, some ZTP platforms only work on a single vendor’s hardware, so heterogeneous environments might have difficulties connecting everything they have.

Setting up modern edge compute infrastructure involves a huge number of factors, but you don’t need to figure it all out yourself. If you’re considering moving your data to the edge, engage our Manufacturing Practice today. We can help you navigate these ever-changing waters and determine the right hardware and software for your unique business needs.

Additional Resources

]]>
Livin’ on the Edge: How Healthcare IT Can... https://community.connection.com/livin-on-the-edge-how-healthcare-it-can-tame-the-chaos-of-edge-management/ Aug 19, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/08/3154480-Healthcare-Edge-BLOG-1.jpg

If you’ve worked in tech long enough, you know we love our acronyms. We invent new ones, recycle old ones, and assign new meaning to the familiar. One term that’s evolved dramatically in recent years is “edge.” Once shorthand for laptops and workstations, it now represents something far more complex and far more critical, especially in healthcare IT.

What Is Edge Management?

To understand Edge Management, it helps to take a step back and look at how computing has evolved. We’ve cycled between centralized processing (mainframes, data centers, cloud) and decentralized models that bring compute power closer to where data is generated.

That’s the edge: local, real-time data collection and processing that improves performance, speeds response times—and supports devices engineered for unstructured data. And in today’s healthcare environment, that edge is expanding rapidly.

Smart Tech at the Bedside

Once smart technology entered healthcare, the edge wasn’t just about devices. It became about experiences. Digital front doors, connected beds, smart monitors, cameras, mobile workstations, and even autonomous robots delivering meals and linens now live at the edge. Add in computer vision for inventory tracking, spatial computing for clinical training, and AI tools that support bedside decision-making, and the edge is no longer a fringe component. It is central to care delivery.

More devices also mean more data. And more challenges for already overextended IT teams trying to balance cost containment, staff shortages, and patient safety.

Managing the Data Tsunami

Most hospitals have experienced slow but steady data creep over the last decade. The result is a mountain of unstructured data scattered across devices that often run on proprietary or disconnected operating systems.

Without clear policies for retention, backup, or recovery, healthcare organizations risk losing visibility into critical information that could inform diagnostics, workflows, and resource planning. That is why edge management is not just an IT priority. It is an operational imperative.

You also can’t talk about data without talking about security. According to the 2025 Verizon Data Breach Investigations Report, only 54% of known edge vulnerabilities were fully remediated across healthcare organizations, and the median patch time was 32 days. Exploitation of these edge devices accounted for 22% of vulnerability-based breach entry points.1

Where to Start: Practical Steps Toward Healthy Edge Management

We couldn’t resist borrowing from Aerosmith’s Livin’ on the Edge:

“Tell me what you think about your situation / Complication, aggravation is getting to you…”

That lyric hits home for many IT leaders trying to wrestle control of a sprawling, unmanaged edge. The good news is that you are not alone. There are tangible steps you can take.

Start by naming edge management as a strategic priority within your IT steering committee. Then define outcome-based goals tied to measurable value: reducing endpoint loss, standardizing security protocols, or decommissioning outdated infrastructure.

Here are a few sobering data points:

  • 60% of healthcare breaches involve the human element, including phishing, credential theft, and social engineering.1
  • Phishing remains the dominant social engineering technique, with new tactics like prompt bombing growing in popularity.1
  • Ransomware appeared in 44% of all breaches, up from 32% the year prior, with many attacks traced back to phishing emails or compromised credentials.1

If you’re launching an AI initiative, use it as a forcing function to evaluate the entire edge environment. Are those smart devices patched? Are logs being monitored? Are you capturing the data they generate in a way that supports clinical goals?

And don’t forget your traditional endpoints. Laptops and bedside tablets might feel “old school,” but they remain some of the most common weak points.

Need Help Getting Started?

Edge management doesn’t have to be overwhelming. Connection’s Healthcare Practice can help assess your current infrastructure, identify gaps, and develop an edge management strategy that supports both IT priorities and clinical outcomes through our solutions and services.

1Verizon, 2025, 2025 Data Breach Investigations Report

]]>
Preparing for Microsoft Volume Licensing... https://community.connection.com/preparing-for-microsoft-volume-licensing-price-level-changes-a-guide-to-the-november-2025-update/ Aug 15, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/08/3172171-Microsoft-Volume-Licensing-BLOG.png

The world of Microsoft licensing is evolving once again, and this time the changes are both sweeping and significant for organizations leveraging the Enterprise Agreement (EA), Online Services Premium Agreement (OSPA), and the Microsoft Products and Services Agreement (MPSA). If you’re a business, IT leader, or procurement professional, the upcoming November 2025 update is poised to reshape the way you evaluate, purchase, and optimize your Microsoft Online Services. At Connection, we recognize the challenges and opportunities these changes present. Our Microsoft Licensing Optimization Services are designed to help customers navigate this transition with confidence, clarity, and strategic insight.

Overview of the November 2025 Price Level Changes

Beginning November 1, 2025, Microsoft will standardize cloud services pricing by removing the existing price level discount structure for Online Services products sold through volume licensing programs. All Online Services products across price levels A-D will be set at a single list price, matching that available publicly on Microsoft.com. This update affects purchases of products such as Microsoft 365, Dynamics 365, Windows 365, Azure (which already did not have price level discounts), along with other security, compliance, identity, and management services.

Key Changes:

  • Differentiated price levels (A-D) for Online Services in volume licensing agreements will be discontinued.
  • Online Services purchased via EA, OSPA (China-specific), and MPSA will adopt a single starting list price, regardless of purchasing channel or previous price level.

Unchanged Aspects:

  • Pricing for on-premises software remains unaffected.
  • U.S. Government and global Education price lists are excluded from these changes.
  • The new pricing applies only upon renewal of agreements or the addition of new Online Services after November 1, 2025.

Customer Impact

This pricing change primarily concerns commercial customers acquiring Online Services through EA, OSPA, and MPSA agreements. Customers with price-protected SKUs may retain their rates until their current enrollment or subscription period ends; however, renewals and new purchases after November 1, 2025 will be subject to the revised pricing.

Connection’s Microsoft Licensing Optimization Services

Guiding You Through Change

Navigating these updates can be complex. Connection’s licensing optimization services are available to assist organizations in understanding the impact of these changes, planning strategies, and optimizing their approach to Microsoft cloud investments.

How Connection Can Help

  • Strategic Licensing Assessments: Comprehensive evaluations of current licensing arrangements to identify potential cost optimization and usage improvements. Assistance in determining the effects of new pricing and recommendations to maximize value.
  • Agreement Renewal Assistance: Guidance throughout renewal processes under new pricing terms, including analysis of alternative purchasing options and support in evaluating transition strategies.
  • Cloud Services Planning: Consultation on cloud adoption strategies that accommodate the unified pricing model, development of roadmaps leveraging Microsoft solutions, and advice on scalability and investment.
  • Cost Management and Forecasting: Use of analytical tools to project budget impacts, uncover savings opportunities, and negotiate terms aligned with organizational requirements.

For further details or assistance related to licensing optimization, please reach out to your Connection Account Manager or CNXN-XLOSupport@connection.com.

]]>
Securing the Edge at Scale: What You Need to... https://community.connection.com/securing-the-edge-at-scale-what-you-need-to-know/ Aug 12, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/08/3143878-GTM-MIM-EdgeSimplified-BLOG.jpg

As enterprises race to embrace edge computing to drive agility, speed, and real-time intelligence, they’re also exposing a troubling reality: edge security is being left behind. From retail kiosks to remote factories and healthcare sites, today’s edge environments are often under-protected, under-monitored, and under-prioritized. In this blog, we explore how to fix the broken state of edge security, and how modern, automated approaches like DevEdgeOps are essential to scaling safely.

Trouble Is Brewing

By 2025, a staggering 75% of enterprise data will be processed outside traditional data centers and clouds—up from less than 10% in 2019.1

Edge infrastructure is expanding rapidly across industries, from autonomous manufacturing lines to smart cities, remote healthcare, and logistics hubs. In fact, global edge spending is expected to reach $350 billion by 2027.2

With that growth comes risk. Most edge sites are “unattended IT”—unmanned, intermittently connected, and often overlooked in security strategies. According to Bank Info Security, edge device breaches are surging.3 The sheer scale and heterogeneity of edge fleets make them harder to monitor, patch, and protect. The edge can’t be a security exception; it must be a security extension.

Risk and the Edge

Edge infrastructure introduces a perfect storm of vulnerabilities:

  • Unsecured IoT devices and legacy endpoints
  • Outdated firmware and weak access controls
  • Limited physical safeguards
  • Blind spots in visibility and threat detection
  • Supply chain risks around firmware integrity and device trust

Traditional IT security models—reliant on firewalls, perimeter defenses, and centralized controls—simply don’t work in the world of distributed, dynamic edge deployments.

81% of organizations recognize that emerging technologies like AI, IoT, and edge computing present significant data protection challenges.4

Embedding Trust into Every Layer

Fixing edge security starts with rethinking how security is integrated. Rather than bolting on protection post-deployment, leading organizations are integrating it into every phase—from hardware provisioning to application updates.

Key principles include:

  • Hardware Root of Trust: TPM chips and secure boot validate firmware authenticity
  • Zero Touch Provisioning (ZTP): Devices onboard securely with no manual steps, using encrypted credentials and FIDO Device Onboarding (FDO)5
  • Out-of-band Management: Allows for diagnostics and recovery even when systems are offline or impaired
  • Mutual TLS and Microsegmentation: Encrypts communications and isolates workloads to reduce lateral movement

Edge security must cover all layers—device, user, network, data, and application. And the physical environment still matters: USB lockouts, tamper-evident seals, and port controls remain essential.
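
As one concrete example of the mutual TLS principle above, here is a minimal sketch using Python’s standard ssl module to build a server-side context that requires client certificates, so both ends of an edge connection must prove their identity. The certificate file paths are placeholders for material issued by your own CA, so the load calls are shown commented out.

```python
# Minimal mutual-TLS server context: the server presents its own certificate
# AND rejects any client that cannot present one signed by the trusted CA.
# Certificate paths are placeholders (hypothetical files).
import ssl

def make_mtls_server_context(cert="server.pem", key="server.key",
                             client_ca="edge-ca.pem"):
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # ctx.load_cert_chain(cert, key)        # server's own identity
    # ctx.load_verify_locations(client_ca)  # CA that signs edge-device certs
    ctx.verify_mode = ssl.CERT_REQUIRED     # handshake fails without a client cert
    return ctx

ctx = make_mtls_server_context()
print(ctx.verify_mode is ssl.CERT_REQUIRED)  # True: clients must authenticate
```

Pairing a context like this with per-segment listeners is one way the encryption and isolation goals reinforce each other: a workload in one microsegment simply has no trusted CA path to a service in another.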

You Can’t Protect What You Can’t See

Legacy monitoring tools focused on log reviews won’t cut it. Instead, modern observability stacks built for the edge provide:

  • Real-time telemetry from remote nodes
  • AI-driven threat detection and correlation
  • Unified dashboards for global edge visibility
  • Automated response workflows and compliance logs

This is not just about visibility—it’s about intelligence. With real-time data and context, organizations can preempt incidents before they become breaches.

DevEdgeOps: Securing Scale Without Slowing Down

As edge fleets scale, security must be integrated into automation and lifecycle management. Enter DevEdgeOps: a modern model that applies DevOps principles to the unique challenges of the edge.

With DevEdgeOps, every device:

  • Is automatically onboarded and validated with secure credentials
  • Receives updates and patches based on risk and vulnerability profiles
  • Operates under centralized policy and compliance control

DevEdgeOps platforms—such as Dell NativeEdge or VMware Edge Compute Stack—leverage blueprints and automation templates to ensure secure, consistent deployments across thousands of sites.

A recent hands-on analysis of Dell NativeEdge by TechTarget’s Enterprise Strategy Group concluded that automation features can save up to 79% of the time typically spent on deploying and managing edge infrastructure.6 This level of automation doesn’t just accelerate deployment—it enforces compliance, reduces human error, and scales protection in lockstep with innovation.

Four Critical Steps to Take Now

Whether you’ve deployed 10 edge nodes or 10,000, a proactive security audit is essential. Start here:

  1. Logging and Monitoring: Centralize logs, implement real-time alerts, and monitor for anomalies continuously
  2. Physical and Supply Chain Validation: Confirm device integrity, check firmware signatures, and inspect for tampering
  3. Security Testing: Run penetration tests and scan for misconfigurations regularly
  4. Compliance and Policy Review: Align with regulations like HIPAA, PCI, GDPR, and IEC 62443
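
Step 2 above (confirming firmware integrity) can be sketched as a digest comparison against a trusted manifest. In production the manifest itself would be vendor-signed and verified with a public key before use; here it is an in-memory stand-in, and the image bytes and names are invented.

```python
# Sketch of firmware integrity checking: an image is accepted only if its
# SHA-256 digest matches the digest pinned in a trusted manifest.
# The firmware bytes and manifest entries below are hypothetical.
import hashlib

def sha256_digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

def verify_firmware(image: bytes, manifest: dict, name: str) -> bool:
    """Return True only if the image hashes to the manifest's pinned digest."""
    expected = manifest.get(name)
    return expected is not None and sha256_digest(image) == expected

firmware = b"edge-gateway firmware v2.4.1"
manifest = {"edge-gw-2.4.1.bin": sha256_digest(firmware)}  # trusted reference

assert verify_firmware(firmware, manifest, "edge-gw-2.4.1.bin")          # intact
assert not verify_firmware(firmware + b"x", manifest, "edge-gw-2.4.1.bin")  # tampered
```

Running a check like this at boot and before every update is what turns supply-chain validation from a one-time receiving step into the continuous process the closing section calls for.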

Security is not a one-time fix. It’s a process: embedded, enforced, and continuously evolving.

Create a Launchpad—Not a Liability

Edge computing is essential for business agility, real-time intelligence, and future-ready infrastructure. But to unlock these benefits, security must be integrated into every layer, not layered on after deployment.

With DevEdgeOps models, zero touch provisioning, and secure lifecycle management, organizations can:

  • Reduce risk
  • Accelerate deployments
  • Enable real-time, AI-powered services
  • Build trust with customers and regulators alike

Don’t wait until the next breach to act. Secure your edge from the start and scale with confidence. Done right, the edge becomes your most secure, agile, and resilient infrastructure layer.

Take a deeper dive into this topic:

Sources:
1. Compunnel, 2025, The Convergence of Edge and Cloud: How Edge Computing Enhances Capabilities
2. FutureCIO, 2024, Edge Computing Investments Will Reach 232 Billion in 2024
3. Bank Info Security, 2025, Breach Roundup: Surge in Edge Device Zero-Day Exploits
4. Dell Technologies, 2025, Dell NativeEdge Security Solution Brief
5. FIDO Alliance, 2025, Passkey Pledge 2025
6. Dell Inc., 2025, Unlocking the Edge: The Latest Innovations from Dell NativeEdge
]]>
The 2025 Retail Mid-year Review:... https://community.connection.com/the-2025-retail-mid-year-review-understanding-performance-demographics-and-technology/ Aug 05, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/08/313243-Mid-year-Retail-Review-BLOG.jpg

As we cross the mid-year mark, the retail industry finds itself at a dynamic crossroads. The first half of 2025 has been a story of contrasts—balancing promising growth with economic headwinds and evolving consumer behaviors. To succeed in the second half of the year, retailers must understand the key forces shaping the market: year-to-date performance across sectors, the profound impact of consumer demographics, and the technological innovations that are redefining the shopping experience.

Retail performance YTD has been a mixed landscape, driven by a complex set of influences. While the global retail market is on track to reach an estimated $35.2 trillion in 2025, the journey has not been without turbulence. Recent data from May and June showed a dip in U.S. retail sales, marking the third consecutive monthly decline and signaling a degree of consumer pullback.

Performance has varied significantly across key segments. The most notable change might be a broad slowdown in e-commerce performance. E-commerce has remained a powerful engine of growth; however, the “easy” double-digit growth available over the past decade is now running closer to 8%. Many retailers, including Target, are actually experiencing flat or negative growth in their e-commerce business.

In general, many retail segments have experienced a rollercoaster of performance for various reasons. The important thing to consider is how each brand will embrace change.

One of the largest factors to consider regarding future retail performance is consumer demographics. Shopping behaviors in 2025 are deeply fragmented across generational, income, and gender lines, forcing retailers to tailor their strategies with greater precision.

The Generational Divides

  • Gen Z and Millennials are now the primary drivers of growth. Gen Z’s spending is expanding at twice the rate of previous generations, and together with Millennials, they are expected to account for nearly half of all luxury goods sales by the end of 2025. This cohort lives online, with 67% of Millennials preferring e-commerce and 34% of young Americans making weekly purchases via social media. They are also values-driven; 64% of Gen Z will research a brand’s ethics and are willing to boycott those that don’t align with their principles.
  • Gen X and Baby Boomers, in contrast, still show a strong preference for physical retail, with 72% of Baby Boomers favoring in-store shopping. This group is less engaged with social commerce and tends to frequent national chains, department stores, and hardware stores.

The Influence of Income and Value

Economic pressures have created a more discerning shopper. About three-quarters of consumers now identify as “denominator shoppers,” focusing primarily on paying less. This has fueled a widespread trade-down to discount retailers like Walmart and TJ Maxx, as well as an increase in the purchase of private-label products, a trend observed across all income levels, especially in the grocery aisle.

One constant every retailer must consider is how they implement technology to drive future sales growth. For the remainder of 2025, technology will be the critical enabler for retailers seeking a competitive edge, as Gen Z and Millennial shoppers demand it. The focus is on using innovation to enhance customer experiences, optimize operations, and drive profitability.

The Dominance of Artificial Intelligence

AI is no longer a futuristic concept but a core component of modern retail strategy.

  • Hyper-personalization: AI allows retailers to analyze customer data to deliver the individualized experiences that 71% of shoppers now expect. Generative AI is being used to create dynamic marketing content and power sophisticated virtual assistants.
  • Supply Chain Optimization: AI-powered forecasting can reduce inventory errors by up to 50%, preventing stockouts and overstocking by analyzing sales data, market trends, and economic indicators.
  • AI Shopping Assistants: These AI agents are set to become more common, simplifying the consumer decision-making process by offering personalized recommendations and streamlining product comparisons. According to Salesforce, 75% of retailers will need AI agents in order to compete.
  • Computer Vision: Existing and newly deployed cameras are changing how stores operate end to end. Vision systems drive productivity through inventory management solutions, improve loss-prevention ROI, and deliver new consumer engagement options such as smart digital marketing.
  • The Internet of Things (IoT): “Smart stores” are leveraging IoT devices to create a more responsive and efficient environment. Electronic shelf labels allow for dynamic pricing, while RFID tags and smart shelves provide real-time inventory tracking, freeing up staff to better assist customers.
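The supply chain forecasting bullet above can be made concrete with a toy baseline. This Python sketch uses a naive moving average; real AI forecasting blends sales data, market trends, and economic indicators, and the function names and the safety-stock figure here are invented for illustration:

```python
def forecast_demand(weekly_sales, window=4):
    """Naive moving-average forecast of next week's demand."""
    recent = weekly_sales[-window:]
    return sum(recent) / len(recent)

def reorder_qty(weekly_sales, on_hand, safety_stock=10):
    """Order enough to cover the forecast plus a safety buffer."""
    need = forecast_demand(weekly_sales) + safety_stock - on_hand
    return max(0, round(need))
```

Even this simple baseline illustrates the core idea: replacing gut-feel ordering with a data-driven rule that reduces both stockouts and overstocking.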

While there may not be a clear view of retail performance for the remainder of 2025, there is a very clear path to long-term success: AI solutions. Consumers are not just accepting this technology; they are expecting or even demanding it. Brands that align their value proposition with consumer expectations and AI solutions will win the race for sales growth.

]]>
Replacement or Reinforcement: What AI Means... https://community.connection.com/replacement-or-reinforcement-what-ai-means-for-healthcare-work/ Jul 31, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/07/3131448-AI-and-Healthcare-Blog-BLOG.jpg

Is it just me? I ask that question often. When I’m reaching for a sweater while everyone else is in t-shirts. When I’m the only one laughing in a movie theater. Or when some bit of tech in my house stops working and, despite a long career in IT, I can’t fix it on the first try.

I asked it again during a completely unscientific LinkedIn poll about meetings. It turns out 89 percent of respondents spend so much time in meetings, it’s equivalent to flying cross-country every day. We’ve all seen the memes: this meeting could have been an email. This email could have been a text. But lately I find myself asking a deeper question: does this meeting even need me?

That question becomes more pressing as generative AI continues to evolve. In episode 17 of his “100 in 100 AI” series, Jamal Khan, Chief Growth and Innovation Officer at Connection, shared a perspective that stuck with me.

“AI is not just a tool,” he said. “It is designed with a fundamentally different goal, not to assist, but to replicate human cognition. The entire arc of AI research has been about replacing the core of what we think makes work skilled.”

Jamal and I had discussed this in a recent webinar but hearing it again in the series landed differently. For the past two years, I’ve told myself a comforting story. AI won’t replace humans. It will replace humans who don’t use AI. But lately, that story has felt a little hollow. If all it takes is a well-crafted prompt to produce the same outcome, then what makes my contribution meaningful?

I’ve seen more and more “open to work” banners from smart, capable people at respected companies. Maybe “corporate restructuring” is just a tidy way to say, “replaced by AI.” Financial efficiency is a powerful motivator, and agentic AI delivers it in spades.

Some days, I leaf through the 2025–2026 Catalog of Classes from the Huntington Beach Adult School, and wonder if it’s time to pivot to something hands-on. Something where prompts don’t matter, and no machine can take your place. But then I remember why I still believe in the work we do.

So yes, AI in healthcare is about replacement. Some roles, some tasks, and some workflows will be automated. That shift is already happening. According to Forrester, 46 percent of U.S. healthcare organizations are already running generative AI in production at the department level.1 But that is just one side of the story. Forrester also reports that half of the top 10 U.S. health insurers are now adopting AI to strengthen member advocacy roles.2

Not everything can or should be replaced. Human experience still matters. There is immense value in clinical judgment, emotional intelligence, and patient connection. The opportunity now is to use AI to support those strengths, not erase them.

At Connection, we are partnering with healthcare organizations that want to approach AI thoughtfully. We are helping teams explore how vision models, generative tools, and specialized computing can reduce administrative load, improve operational efficiency, and enable more focused care.

If your organization is ready to have an honest conversation about what AI can do and what it should do, we are here. The CNXN Helix Center for Applied AI and Robotics Team at Connection brings a blend of technical insight and real-world healthcare experience to every conversation. Let’s figure out what is worth replacing, what is worth rethinking, and what is worth protecting.

Sources:

  1. https://www.forrester.com/blogs/generative-ai-impact-on-clinicians-bringing-the-fever-down/
  2. https://www.forrester.com/blogs/predictions-2025-healthcare/

]]>
TechSperience Episode 139: RSAC 2025... https://community.connection.com/techsperience-episode-139-rsac-2025-unpacked-expert-reactions-from-the-connection-security-team/ Jul 29, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/07/3129171-TechSperience-Ep139-RSAC-2025-Unpacked-Podcast.jpg

In this episode of the Connection Cybersecurity Podcast, host Kim Coombes is joined by our Security Center of Excellence leaders John Chirillo and Rob Di Girolamo, along with Microsoft Security expert Robin Camirand, to unpack the biggest insights from the RSAC 2025 Conference.

From the rise of AI-powered defenses and identity threats to the growing buzz around quantum computing, the team shares key takeaways, favorite moments, and what these trends mean for the future of cybersecurity. Whether you’re curious about agentic AI, post-quantum cryptography, or just want to hear about goats and code-breaking shenanigans—this recap of the conference has something for everyone.

Speakers

John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection  
Kimberlee Coombes, Security Solution Architect, Connection
Robin Camirand, Inside Solution Architect, Connection 

Show Notes

00:00 Welcome and Overview of RSAC 2025
02:27 Community and Collaboration at RSAC
04:39 Identity Management Challenges
07:58 Zero Trust Principles in Security
08:37 Quantum Computing and Post-quantum Cryptography
10:20 Agentic AI in Security Operations
12:42 Emerging Defenses Against AI-powered Attacks
14:48 Best Practices for Leveraging AI in Security
17:04 Favorite Moments from RSAC 2025
20:25 Summing Up RSAC 2025

For more information on how to better secure your environment, visit connection.com/cybersecurity. If you’re ready to start the conversation about what Connection can do to help your organization, call 1.800.998.0067. Or if you have a Connection Account Team already in place, please reach out.

]]>
Data Security and Compliance: Strengthening... https://community.connection.com/data-security-compliance/ Jul 24, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/07/3116521-MFST-Solutions-and-Compliance-Blog-BLOG.png

Data security is a critical priority for everyone, especially those who manage sensitive information across a variety of workstreams within an organization. Implementing robust security measures is an essential component of safeguarding data integrity and preventing unauthorized access.

Data security includes the protective measures that organizations and their IT teams implement to protect digital information from unauthorized access, corruption, or theft throughout its lifecycle. It plays a crucial role in preserving privacy, ensuring that critical digital assets are shielded from potential breaches.

In the broader context, data security supports cybersecurity and compliance efforts, which help organizations mitigate risks and align with legal mandates. Both compliance and data security are paramount to developing a strong security posture.

Key Components of Data Security

Effective data security is built on foundational components that ensure sensitive information remains secure, even if it’s accessed illicitly. These components include:

  • Encryption
  • Access controls
  • Tokenization
  • Data masking

Backup strategies and data loss prevention mechanisms are also critical, as they allow organizations to quickly recover from incidents. Endpoint protection tools work with these measures to secure devices connected to a network. Data security principles like least privilege and zero trust can also help strengthen security by limiting access rights to only those who absolutely need access, while verifying every transaction within the system.
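As a toy illustration of two of these ideas, least-privilege access checks and data masking, here is a minimal Python sketch. The `ROLE_PERMS` table and `mask_pan` helper are hypothetical names invented for this example, not part of any particular product:

```python
# Hypothetical role table -- illustrative names only.
ROLE_PERMS = {"analyst": {"read"}, "admin": {"read", "write"}}

def can(role: str, action: str) -> bool:
    """Least privilege: permit only actions explicitly granted to the role."""
    return action in ROLE_PERMS.get(role, set())

def mask_pan(pan: str) -> str:
    """Data masking: expose only the last four digits of a card number."""
    return "*" * (len(pan) - 4) + pan[-4:]
```

The point of both functions is the same: even if a request or a record reaches the wrong hands, the sensitive part stays out of reach by default.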

Technical Controls and Safeguards

Technologies such as firewalls, antivirus software, and encryption are fundamental to data security, and are common safeguards within most IT systems. Key functions include:

  • Firewalls: Monitor and filter network traffic to block unauthorized access and threats.
  • Antivirus programs: Detect and eliminate malicious software.
  • Encryption: Converts data into unreadable formats that are only accessible to authorized users.

Implementing technical safety measures helps strengthen an organization’s defense against cyber threats, enhancing its overall security posture.
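To make the firewall function above concrete, here is a minimal allowlist filter sketched with Python’s standard `ipaddress` module. The networks and blocked ports are invented for illustration; a production firewall evaluates far richer rule sets:

```python
from ipaddress import ip_address, ip_network

# Hypothetical policy: internal ranges allowed, a few risky ports blocked.
ALLOWED_NETS = [ip_network("10.0.0.0/8"), ip_network("192.168.1.0/24")]
BLOCKED_PORTS = {23, 445}  # e.g., Telnet and SMB

def permit(src_ip: str, dst_port: int) -> bool:
    """Allow traffic only from allowlisted networks and never to blocked ports."""
    src = ip_address(src_ip)
    return any(src in net for net in ALLOWED_NETS) and dst_port not in BLOCKED_PORTS
```

The design choice worth noting is default-deny: anything not explicitly allowed is dropped, which mirrors the zero-trust principle discussed earlier.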

Human and Process-oriented Safeguards

While technical safeguards are critical in maintaining data security, no security plan is complete without human safeguards. These can include training users to:

  • Identify phishing attacks
  • Manage credentials securely (using strong, unique passwords, enabling multifactor authentication (MFA), etc.)
  • Follow their organization’s standard security protocols

Access management tools can also help enforce secure behaviors by granting permissions based on roles and responsibilities. Additionally, incident response plans and regular audits ensure that security policies are both effective and adhered to, minimizing vulnerabilities arising from human error.

Data Security Standards and Compliance

As mentioned before, part of maintaining data security involves meeting compliance requirements. Commonly recognized compliance standards such as ISO/IEC 27001, NIST Cybersecurity Framework (CSF), and CIS benchmarks provide a structured approach to building secure infrastructures and practices. Additionally, global and regional regulations like GDPR, HIPAA, CCPA, and SOX impose specific obligations for data privacy, reporting, and breach notification. For instance, GDPR mandates that organizations obtain user consent for data collection and report breaches within 72 hours.
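The 72-hour window is straightforward to operationalize in incident-response tooling; as a trivial sketch (the function name is ours, not from any compliance product):

```python
from datetime import datetime, timedelta

def notification_deadline(discovered: datetime) -> datetime:
    """GDPR Article 33: notify the supervisory authority within 72 hours
    of becoming aware of a personal data breach."""
    return discovered + timedelta(hours=72)
```

Encoding deadlines like this in ticketing or SOAR workflows removes any ambiguity about when the clock runs out.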

Regulatory Alignment and Risk Reduction

Because compliance requirements vary significantly across industries and regions, organizations must stay informed about their unique obligations. Industry-specific compliance requirements, such as those in finance (SOX) or healthcare (HIPAA), necessitate tailored security measures. Meeting these requirements ensures not only regulatory compliance but also the protection of sensitive customer data, fostering trust and reputation.

Achieving and Maintaining Compliance

Compliance looks different for every organization, but at its core, organizations should adopt internal best practices for maintaining standards, such as developing thorough documentation and conducting regular access audits. Compliance frameworks and tools, such as those provided by Microsoft Purview, can also simplify tracking and reporting, helping teams meet regulatory demands quickly while keeping data secure.

Connection offers a variety of cybersecurity compliance solutions to support organizations’ compliance efforts. To learn more, visit Connection’s compliance solutions webpage.

The Role of AI in Secure and Compliant Data Management

AI-driven tools continue to revolutionize data security management through automated threat detection and streamlined compliance tracking. By analyzing vast amounts of data, AI models can identify anomalies and potential vulnerabilities faster than manual methods. However, integrating AI into security processes also introduces risks such as algorithmic biases and potential misuse. Organizations must carefully evaluate and manage these risks to maximize the benefits of AI while maintaining robust safeguards, including:

  • Securing AI datasets, inputs, outputs, and training pipelines, as well as AI deployment, to mitigate risks, such as data leakage.
  • Ensuring adherence to privacy laws like GDPR and CCPA through governance, auditing, and ethical guidelines.
  • Using AI responsibly in security operations to enhance phishing detection, anomaly identification, and predictive threat analysis.
  • Implementing a robust data security management plan with strategies like data classification, strict access controls, and lifecycle management, supported by regular risk assessments and continuous monitoring.
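The anomaly-identification idea above can be illustrated with a minimal statistical baseline that flags values far from the mean. Real security analytics use far richer models; this stdlib-only sketch just shows the principle:

```python
from statistics import mean, stdev

def anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > threshold * sigma]
```

Fed, say, daily failed-login counts, such a baseline surfaces the one spike a human reviewer would otherwise have to hunt for manually.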


Leveraging Microsoft Security Solutions

Microsoft offers a comprehensive suite of security solutions tailored to enhance governance, protection, and compliance. Microsoft Purview, for example, helps businesses manage data governance through discovery, classification, and compliance tracking. Microsoft Purview can also help organizations maintain visibility into their data to stay in alignment with any necessary regulatory requirements.

Microsoft Defender, another Microsoft security solution, can complement this by providing robust endpoint protection and advanced threat intelligence. Microsoft Defender secures endpoints, cloud services, and hybrid environments with real-time threat detection and automated response capabilities and integrates with other Microsoft solutions, such as Microsoft 365, Azure, and Sentinel.

Is Microsoft Defender Good for Enterprise Security?

Microsoft Defender provides strong endpoint and cloud security with automated threat detection and real-time response, making it ideal for businesses protecting their infrastructure. It also integrates with Microsoft Purview to create a comprehensive approach to security and compliance, all within the Microsoft ecosystem.

Connection’s Cybersecurity Services

In an era of increasing cyber threats and regulatory scrutiny, data security and compliance are more critical than ever. By adopting robust security measures, aligning with recognized standards, and using advanced tools like Microsoft Defender and Microsoft Purview, you can strengthen your security posture, while also ensuring regulatory compliance for your organization.

Connection offers a vast array of cybersecurity services to help get you started. These include:

  • Security assessment and testing to help you find and map vulnerabilities before they can be exploited, and to detect outdated data security policies that could put your network at risk.
  • Security compliance solutions to help organizations meet compliance standards based on regional laws and industry standards, as well as update and/or develop security certifications, policies, and documentation.
  • Managed and monitored security services, which help manage security alerts and reports by providing visibility into security patches on end-user devices.
  • Security technology integration to streamline and optimize security operations to free up time for IT teams to prioritize higher-value work.

To learn more about Connection’s security solutions, connect with a Connection Cybersecurity Expert and subscribe to our newsletter to receive the most up-to-date information on data security.

]]>
Edge Compute Management: Transforming IT... https://community.connection.com/edge-compute-management-transforming-it-operations-for-distributed-platforms-at-scale/ Jul 21, 2025 Cameron Bulanda https://community.connection.com/author/cameron-bulanda/ https://community.connection.com/wp-content/uploads/2025/07/3111121-GTM-MIM-AIOps-Lenovo-BLOG.jpg

I recently hosted an interesting webinar with Torsten Volk (Principal Analyst of Application Modernization, Enterprise Strategy Group) and David Brown (ISG NA Technical Strategist, Lenovo), and I am eager to share some of the highlights from this event. If you are planning to manage compute and data beyond the four walls of a traditional data center—such as with retail stores, manufacturing plants, hospitals, or gas stations—this blog post provides insights we covered in the webinar on why edge infrastructure is critical.

The Edge: Compute Wherever You Need It

Let’s talk about what “the edge” means. David described it simply as any place your infrastructure runs outside of the main data center. And when you stop to think about it, that could be anywhere: the clinic down the street, a remote oil rig in the Gulf, or even the new smart coffee machine in your office kitchen. Why push compute and storage out there? Because local processing cuts latency, enabling your AI models, real-time analytics, and IoT devices to make decisions instantly—exactly when and where you need them.

Distributing servers and sensors to hundreds (or thousands) of locations brings massive complexity. You’ve got different teams spinning up solutions, patching machines, and installing software—often without informing central IT. This can lead to hidden security risks, skyrocketing costs, and a complex setup of siloed tools no one can manage from a single place.

Why Central IT Can’t Ignore the Edge

This is where centralized management comes in. XClarity, Lenovo’s infrastructure management software suite, is the company’s “secret sauce.” It takes a holistic systems management approach that supports everything from rugged edge appliances to AI-optimized servers. XClarity is built on Redfish and offers open APIs for easy integration with automation tools such as Ansible or Terraform. Say you want to deploy an OS update: you define it once in your playbook, and XClarity handles zero-touch provisioning, retries automatically if a site loses connectivity mid-update, and reports back on success or failure.
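That retry-and-report pattern can be sketched generically in Python. This is not XClarity’s actual API, just an illustration of zero-touch rollout logic with automatic retries; `apply_update` stands in for whatever mechanism actually pushes the update to a site:

```python
def roll_out(sites, apply_update, max_retries=3):
    """Push an update to every site, retrying transient failures,
    and report per-site status back to central IT."""
    report = {}
    for site in sites:
        for attempt in range(1, max_retries + 1):
            try:
                apply_update(site)  # stand-in for the real provisioning call
                report[site] = f"ok (attempt {attempt})"
                break
            except ConnectionError:
                report[site] = "failed"  # overwritten if a later retry succeeds
    return report
```

The key design point is idempotent, declarative rollout: the operator defines the desired state once, and the controller absorbs the flaky-network reality of edge sites.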

We also delved into Lenovo’s Open Cloud Automation—LOC-A for short. Imagine a no-code platform that orchestrates the entire lifecycle—including onboarding bare-metal servers, configuring clusters, and pushing out the right firmware and OS images—all without a field technician ever setting foot on site. The time and resource savings are enormous, and local teams can stay focused on their core business rather than wrestling with IT drudgery.

But it doesn’t stop there. In the webinar, I also covered the notion of DevEdgeOps: applying DevOps principles to this wildly heterogeneous fleet of edge devices. CI/CD pipelines already build confidence in cloud deployments; DevEdgeOps extends similar reliability to every kiosk, rackmount node, and smart sensor you’ve got in the field. Consistent blueprints, centralized monitoring, and policy-driven automation mean you can push updates and trust they’ll land correctly, no matter how unreliable the local network might be.

Torsten commented that the common challenge is “the complexity of modern infrastructure (spreading across data centers, clouds, edge locations, and more) needs to support a consistent application experience. We need data locality to have low latency, which means we have certain data at the edge that requires data-driven decision-making based on that data. Simultaneously, processing needs are increasingly shifting to the edge along with compute.”

AI plays a role as well. XClarity Administrator isn’t just a dashboard; it embeds machine learning (ML) to surface actionable insights. For example, predictive maintenance alerts before a server fails, or automated remediation suggestions when anomalies appear in your telemetry. And the CNXN Helix Center for Applied AI and Robotics is working on next-gen AI tools, so you get a continuously improving platform that learns from every edge deployment.

Tailoring Edge Solutions for Your Vertical Needs

Each business vertical has specific needs. For example, retail chains need instantaneous inventory updates. Manufacturers rely on IoT-driven quality inspection. Healthcare clinics process images locally for faster diagnoses. Connection and Lenovo are teaming up with ecosystem partners such as Intel—which offers its Tiber Edge—and cellular failover providers to deliver full-stack solutions tailored to specific industry needs.

If you’re interested in learning more about managing the edge and how to eliminate the patchwork of one-off scripts, nail down a unified control plane, or bring AI-driven automation to every edge node, watch the full webinar. You’ll hear war stories from large-scale edge rollouts and learn which emerging tools will simplify your work.

Take a deeper dive into this topic:

]]>
Transforming Care: How AI Is Powering... https://community.connection.com/transforming-care-how-ai-is-powering-patient-centered-healthcare/ Jul 15, 2025 Jamal Khan https://community.connection.com/author/jamal-khan/ https://community.connection.com/wp-content/uploads/2025/07/3092571-Transforming-Care-BLOG.jpg

The transformation of healthcare through artificial intelligence (AI) is no longer theoretical. It’s here and evolving rapidly. While many conversations around AI focus on technical feats, a more profound shift is underway: AI is enabling a new model of patient-centered care—one where technology doesn't replace clinicians but empowers them, reduces administrative burden, and allows more time and focus on the patient experience.

AI is Creating a Shift from Systems-centered to Patient-centered Care

For decades, the healthcare industry has been mired in system inefficiencies. Electronic health records (EHRs), though critical, have added layers of complexity and burden to the provider-patient interaction. Physicians spend more time inputting data than engaging with their patients. But AI is helping to alleviate this pain point.

Voice recognition, natural language processing, and sentiment analysis are being integrated into healthcare workflows to automate documentation and capture context during patient interactions. Instead of typing through appointments, physicians can focus on listening, engaging, and making informed decisions with AI summarizing and structuring clinical notes in the background.

Enhancing Outcomes with Effective Data Orchestration and Management

However, despite these benefits, one of the most pressing challenges in implementing AI is data orchestration. Healthcare organizations typically operate across multiple disconnected systems, from their EHR system to imaging platforms, and more. These systems produce vast amounts of data in varied formats, much of it unstructured or inconsistent.

Data orchestration involves bringing all this data together into a cohesive, usable structure. This includes digitizing paper records, normalizing formats, tagging datasets for accessibility, and building data lakes that can serve as the foundation for AI modeling. Even so-called “dirty” or unstructured data can be valuable when used correctly, as AI models can extract patterns from it that humans might never detect.
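As a simplified sketch of the normalization step, here is how records from two hypothetical source systems might be mapped onto one common schema. All field names here (`mrn`, `first`, `birth_date`, and so on) are invented for illustration:

```python
def normalize(record: dict) -> dict:
    """Map fields from two hypothetical source systems onto one common schema."""
    return {
        "patient_id": record.get("mrn") or record.get("patient_id"),
        "name": (record.get("name")
                 or f"{record.get('first', '')} {record.get('last', '')}").strip(),
        "dob": record.get("dob") or record.get("birth_date"),
    }
```

Production pipelines add validation, terminology mapping (e.g., to FHIR resources), and provenance tracking, but the core move is the same: many inconsistent shapes in, one tagged, queryable shape out.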

In parallel, synthetic data—realistic but artificially generated data—is gaining traction as a privacy-preserving way to train AI models. When done well, it enables healthcare organizations to experiment, iterate, and innovate without risking patient confidentiality. But poor synthetic data quality has been shown to produce unreliable results, so investments in high-fidelity data generation and validation are essential.

Reducing Administrative Burden to Improve Patient Care

One of AI's most immediate benefits is reducing repetitive tasks, such as documentation, billing codes, and appointment scheduling. These tasks drain time and energy from healthcare professionals. Virtual assistants, for example, are making it easier for patients to schedule appointments, access test results, or ask questions, while freeing up frontline staff for more critical tasks.

This shift is especially significant in light of ongoing physician and nursing shortages. Any technology that can reclaim time for human connection is not just a convenience. It’s a necessity for sustainable care delivery.

Enabling Proactive, Personalized Care

AI's potential goes beyond administrative tasks and workflow automation. The real promise lies in making care more personalized and predictive. By integrating disparate data from EHRs, wearables, genomics, and even patient-reported outcomes, AI can help identify subtle trends in a patient’s health trajectory that might be missed in routine visits.

Tools that detect sepsis or stroke based on subtle early warning signs are already in use. Because these conditions often follow predictable symptom progressions, AI can flag them earlier, improving outcomes and reducing long-term costs.
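For a sense of how such early-warning logic works, consider a qSOFA-style bedside screen, which combines three simple criteria. This toy Python version is illustrative only and not a clinical tool; real deployed models weigh many more signals over time:

```python
def early_warning(resp_rate, systolic_bp, altered_mentation):
    """Toy qSOFA-style screen: two or more criteria suggest possible sepsis risk.
    Illustrative only -- not a clinical decision tool."""
    score = ((resp_rate >= 22)          # elevated respiratory rate
             + (systolic_bp <= 100)     # low systolic blood pressure
             + bool(altered_mentation)) # altered mental status
    return score >= 2
```

The value of AI here is catching the same progression earlier and more reliably than a fixed rule, across thousands of patients at once.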

More advanced systems are also analyzing population-level health data to identify patterns of chronic illness, environmental risk factors, and disparities in care delivery. That insight enables health systems to intervene earlier and allocate resources more effectively at both the individual and community level.

How to Use AI Responsibly for Optimal Patient Care and Trust

Patient-centered healthcare must also empower patients in understanding how their data is used. Patients want and deserve to know when AI is part of their care, how it works, and what data it draws from. Informed consent protocols and AI regulations are constantly changing, and healthcare leaders must navigate a growing array of guidelines and mandates, such as:

  • The EU AI Act: One of the most comprehensive efforts globally to regulate AI use by risk category, with direct implications for developers and deployers of clinical algorithms.
  • U.S. Executive Orders and FDA Guidance: U.S. frameworks are evolving to address algorithmic transparency, bias mitigation, and patient safety—all of which will impact how AI is evaluated and approved.

Healthcare organizations that fail to incorporate governance, fairness, and explainability into their AI deployment strategies will find themselves exposed. Those that do will be better positioned to build trust and avoid costly fines and setbacks.

AI Transparency Means Better Care for All

Beyond compliance concerns, AI transparency and bias mitigation are the key to equity in the age of AI-powered care. If left unchecked, AI systems trained on non-representative data can reinforce biases, resulting in unequal access to care or lower-quality outcomes for certain populations. Establishing governance standards, performing regular audits, and including diverse voices in AI design are crucial steps toward ethical, bias-free AI. 

Building a Future Where the Patient Is Truly at the Center

AI is not a silver bullet. But when strategically applied, it is a powerful tool to reorient healthcare around the patient. By automating the mundane, synthesizing complex data, and illuminating hidden patterns, AI enables providers to focus on what truly matters: human connection, clinical excellence, and compassion.

As healthcare leaders consider AI investments, the most important question isn’t just what technology can do—it’s how it can bring us closer to care that sees, understands, and serves every patient more fully. For more insights and updates on AI in healthcare, visit our dedicated healthcare page at www.cnxnhelix.com/healthcare.

]]>
TechSperience Episode 138: Elevating... https://community.connection.com/techsperience-episode-138-elevating-patient-centered-care-with-intelligent-healthcare-technology/ Jul 02, 2025 Jamal Khan https://community.connection.com/author/jamal-khan/ https://community.connection.com/wp-content/uploads/2025/06/3088022-TechSperience-Ep138-Helix-AI-Healthcare-BLOG.jpg

Join Jamal Khan and Jennifer Johnson as they explore the evolving landscape of AI in healthcare, focusing on its applications, ethical considerations, data privacy, and the role of Chief AI Officers. This discussion highlights the importance of governance, patient consent, and the potential of AI to improve healthcare workflows while addressing data security challenges. Learn about how to implement AI responsibly for better healthcare outcomes and operational excellence. 

Speakers:

Jamal Khan, Chief Growth and Innovation Officer, Connection and Head of the CNXN Helix Center for Applied AI and Robotics

Jennifer Johnson, Director of Healthcare Strategy and Business Development at Connection

Show Notes:

00:00 The Evolution of AI in Healthcare
03:04 Ethics and Governance in AI Application
06:05 Data Privacy and Security Concerns
08:49 The Role of Chief AI Officers
12:07 Patient Consent and Data Usage
14:54 AI’s Impact on Healthcare Workflows
18:00 Computational Power in Health Data Analysis
20:47 Virtual Assistants in Healthcare
24:00 Clinical Trials vs. Drug Discovery
26:55 The Future of Patient Data Management
28:11 AI Adoption in Insurance Companies
33:05 Transparency and Explainability in AI
37:28 AI Use Cases in Healthcare
44:10 Cloud vs. On-prem AI Solutions
49:23 Data Orchestration in Healthcare

For more information on AI services for healthcare, visit https://www.cnxnhelix.com/healthcare.

]]>
Future Proof Your Business with Microsoft Azure https://community.connection.com/future-proof-your-business-with-microsoft-azure/ Jul 01, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/06/3087071-Microsoft-Azure-Webinar-Blog-BLOG.jpg

Keeping up with ever-evolving cloud technology is imperative for businesses to thrive. That’s why we’re excited to introduce our four-part Microsoft Azure webinar series, designed to empower you with the necessary knowledge and tools to embrace the future of cloud security.

In this blog, we’ll dive into the first session of the series, “The Azure Advantage: Be Future-ready with Microsoft Azure,” where we explore how Microsoft Azure’s scalable, secure, and innovative cloud solutions can help your business adapt and thrive in a rapidly evolving digital landscape. Stay tuned as we uncover the key takeaways, including the Azure difference, how to future-proof your business, some Azure use cases, and information on our Connection CSP+ Program.

The Importance of Future Proofing in Today’s Business Environment

One of the first topics discussed in the webinar is why enterprises continue to struggle when choosing the best Cloud Service Provider (CSP), often because of licensing complexities, outdated software, and aging data centers. Future-proof technology is also a hot topic, as IT teams need to focus on implementing the most cost-effective and secure technology for their organizations. And with constant changes and updates across cloud providers and services, settling on one can be challenging.

But we are reminded time and again that cloud adoption eliminates the need for constant hardware upgrades, which are costly and time-consuming. In turn, this helps organizations achieve scalability, an important component of future-proofing your IT landscape.

Microsoft’s Adaptive Cloud Approach to Security

Microsoft Azure is a truly unique platform that spans more than 60 regions and 300 data centers worldwide. With over 190 network points of presence, Azure cloud services provide reach and resilience to the businesses that use them. Performance, security, and reliability remain at the root of Azure’s cloud services, and migrating to Azure is a key step for businesses to become AI-ready.

Plus, Azure meets you where you’re at with an adaptive cloud approach. Azure enables you to:

  • Migrate and modernize apps, data, and infrastructure to easily adopt IaaS or flexible PaaS services for industry-leading performance, availability, and reliability.
  • Move VMware as-is, which allows you to use Azure VMware technology and team expertise.
  • Extend to hybrid, multicloud, and edge to unify siloed teams and distributed sites into one operation, security, application, and data model with Azure Arc.

How Microsoft Azure Enables Business Resilience and Growth

Robust Security and Compliance

Azure takes a proactive approach to safeguard customers’ data from emerging and increasing cloud security threats through built-in security measures that strengthen cloud workloads from the start. Azure’s security intelligence can help identify threats before they occur so users can respond quickly.

With Azure, end-of-life servers can continue to receive extended security updates through services like Azure Arc, which delivers them on demand. Servers that have migrated into Azure receive three years of security updates for free.

Driving Innovation with AI and Machine Learning

AI is shifting the way businesses operate, including how they migrate to the cloud. A report from IBM found that 75% of CEOs believe that the organization with the most advanced generative AI will have a competitive edge.[1] This is because AI can help expedite mundane, time-consuming tasks, such as data entry or digging for information, so that businesses can focus on more meaningful tasks, like collaborating with colleagues and spending more time with customers.

For example, in retail, AI can provide data-driven insights for workers to focus on sustainability efforts, like optimizing their supply chains to reduce product waste. And in healthcare, providers can use AI tools to analyze individual patient data to make informed treatment decisions based on their patients’ unique needs.

But AI is not enough for businesses to succeed; they need the right infrastructure to use AI effectively and successfully, and Microsoft Azure Cloud can help power this initiative.

Azure infrastructure offers a variety of transformative AI solutions, including Azure AI Studio and Azure Machine Learning, which help users train and deploy custom AI models. These services are backed by enterprise-grade security to ensure data is protected from the start.

Cost-saving Offers and Licensing

Microsoft Azure is revolutionizing how businesses manage costs and compliance by offering flexible pay-as-you-go licensing for products like SQL Server and Windows Server 2025. This allows customers to pay only for what they use, without making long-term, costly commitments.
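To see why pay-as-you-go can pay off for intermittent workloads, here is a back-of-the-envelope sketch. All prices and both helper functions are hypothetical illustrations, not Microsoft rates or tools:

```python
# Hypothetical comparison of pay-as-you-go vs. an upfront perpetual license.
# All figures are illustrative placeholders, not actual Microsoft pricing.

UPFRONT_LICENSE_COST = 5000.0   # one-time perpetual license (hypothetical)
HOURLY_PAYG_RATE = 0.50         # per-hour pay-as-you-go rate (hypothetical)

def breakeven_hours(upfront: float, hourly: float) -> float:
    """Hours of usage at which pay-as-you-go cost equals the upfront license."""
    return upfront / hourly

def cheaper_option(hours_used: float) -> str:
    """Return which model costs less for a given amount of usage."""
    payg_cost = hours_used * HOURLY_PAYG_RATE
    return "pay-as-you-go" if payg_cost < UPFRONT_LICENSE_COST else "upfront"

hours = breakeven_hours(UPFRONT_LICENSE_COST, HOURLY_PAYG_RATE)
print(f"Breakeven at {hours:.0f} hours of usage")   # 10,000 hours here
print(cheaper_option(2000))                          # intermittent use favors pay-as-you-go
```

The point is simply that below the breakeven utilization, paying per hour avoids tying up budget in capacity you rarely use.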

Connection and Azure Use Case

One of the key indicators of success with Azure is a good case study. In one example shared in the webinar, a technology manufacturer needed to manage rapid growth within the AI industry. They were facing rising on-premises virtualization costs and a shortage of staff skilled in cloud data automation, and they wanted a hybrid virtualization stack to support their on-premises workloads. Connection helped the manufacturer offset costs and use Azure Kubernetes Service (AKS), application containers, and data automation tools to implement a hybrid architecture that worked with their existing Azure stack to streamline processes. The project resulted in improved automation code and on-premises-to-cloud migrations performed at scale.

In another example, an energy provider wanted to modernize its legacy, on-premises IT platform and lower operational costs. Connection created a Microsoft Modern Workplace to migrate the customer to Microsoft 365 with OneDrive, Teams, and SharePoint, and used Microsoft Purview to conduct a data governance review of the customer’s data loss prevention and compliance standards. Finally, Connection remediated the risks it found with a new Microsoft security architecture, improving the customer’s security posture while lowering operational costs and enabling the customer to self-manage their new environment.

CSP+ Program at Connection

As an Azure Expert Managed Service Provider (MSP) and a Microsoft Partner, Connection is uniquely qualified to architect, deploy, and manage your digital estate, offering best-in-class managed services and delivering value from a range of Azure services and solutions, including:

  • Cloud architecture design
  • Hybrid cloud implementation
  • Migration and optimization
  • Azure Arc, IoT, Migrate, Storage

On top of our experience supporting Azure services, we provide a CSP+ Program which offers customers support in cloud management and includes services like:

  • Financials and billing
  • Onboarding services
  • Account support
  • Lifecycle management
  • Azure readiness assessment, advisor hours, and more

Customers can also add additional services, including an insights and optimization platform, a dedicated product implementation services team, and an Azure migration assessment.

For more information about our CSP+ Program, visit our website. Stay tuned for another blog on the next webinar in our Azure series!


[1] “CEOs Embrace Generative AI as Productivity Jumps to the Top of Their Agendas.” The IBM Institute for Business Value, 32nd edition, 27 June 2025.

]]>
Why Healthcare Needs UCaaS Now More Than Ever https://community.connection.com/why-healthcare-needs-ucaas-now-more-than-ever/ Jun 26, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/06/3083951-GTM-HC-WhyHCNeedsUCaaS-BLOG.jpg

Unified communications as a service (UCaaS) is rapidly becoming essential in healthcare, offering scalable, secure, and patient-centric solutions. As healthcare organizations strive to modernize operations, improve patient outcomes, and adapt to hybrid work models, UCaaS stands out as a transformative tool.

At Connection, we understand that effective communication is pivotal to delivering quality care. UCaaS isn't just about connecting people—it’s about revolutionizing how care teams collaborate and engage with patients.

Modern Healthcare Requires Modern Communication

Healthcare professionals operate in complex environments that demand flexibility and rapid response. Traditional communication systems often fall short, leading to inefficiencies and potential risks.

UCaaS platforms like Fusion Connect, Microsoft Teams, and RingCentral integrate voice, video, messaging, and collaboration into a unified, secure solution. They enable:

  • Rapid care coordination: Instant connectivity across departments accelerates decision-making and improves patient outcomes.
  • Enhanced patient engagement: Secure messaging and virtual visits provide patients with convenient access to care.
  • Support for hybrid teams: Non-clinical staff can work remotely without compromising productivity or security.

UCaaS in Action: Real-world Use Cases

Streamlining Emergency Department Communications with RingCentral

A hospital’s emergency department implemented RingCentral’s unified communications platform to enhance telehealth services. The integration enabled seamless, virtual communication between patients and staff, improving patient-staff interactions while reducing costs.1

Coordinating Home Health Services via Microsoft Teams

Home Health in Microsoft Teams facilitates the creation and management of home health cases, including scheduling home visit appointments. Care team members can access patient information and coordinate services efficiently, ensuring continuity of care for patients managing chronic conditions at home.2

Modernizing Front Desk Operations with Fusion Connect

A nationwide healthcare provider operating 400 locations partnered with Fusion Connect to upgrade their communication systems. By transitioning to Fusion Connect’s hosted voice services, they achieved significant cost savings—approximately $1.2 million annually—and improved service delivery across all locations.3

Scaling Patient Support with AI-Enhanced UCaaS

Healthcare organizations are increasingly adopting AI-powered UCaaS platforms to handle high volumes of patient inquiries. These systems can automate tasks such as appointment scheduling, prescription refills, and answering frequently asked questions—thereby freeing up staff to focus on more complex patient needs.4

Addressing Healthcare’s Unique Challenges

Healthcare organizations face stringent compliance, security, and uptime requirements. Our curated UCaaS offerings include vendors with deep healthcare experience and robust HIPAA-compliant architectures.

  • Fusion Connect: Offers tailored compliance features and integration with Electronic Health Record (EHR) systems.
  • Microsoft Teams: Widely adopted across health systems, supporting telehealth workflows and integrating seamlessly with Microsoft 365 environments.
  • RingCentral: Provides advanced call analytics and contact center capabilities, ideal for high-volume patient service operations.

At Connection, we don’t just implement solutions—we help you navigate vendor complexity, ensure alignment with your IT and clinical goals, and bring scalable communication frameworks to life.

The Future: Integrating AI into UCaaS

The integration of AI into UCaaS platforms is poised to further transform healthcare communication. Emerging features include:

  • Real-time sentiment analysis assists patient-facing staff in understanding and responding to patient emotions.
  • AI-powered coaching and recommendations enhance call quality and staff training.
  • Agentic AI solutions independently handle patient inquiries at scale, improving efficiency and patient satisfaction.5

These innovations are not limited to contact centers—they’re beginning to reshape care navigation, patient triage, and staff training. As these technologies evolve, UCaaS will serve as the foundational layer enabling them.

Ready to Transform Your Communication Systems?

Whether you’re aiming to modernize legacy phone systems, support a hybrid workforce, or pilot AI-assisted patient engagement, UCaaS offers a proven path forward.

Engage Connection’s healthcare experts to help you identify the right platform, unlock funding opportunities, and build a roadmap that aligns with your long-term goals.

Let’s make healthcare communication as advanced as the care it supports.

  1. https://www.ringcentral.com/us/en/blog/telehealth-in-the-er-can-it-work/
  2. https://learn.microsoft.com/en-us/dynamics365/industry/healthcare/use-home-health
  3. https://www.fusionconnect.com/resources/case-studies/healthcare-provider
  4. https://www.vonage.com/resources/articles/ai-for-healthcare/
  5. https://www.nojitter.com/ucaas/how-ai-is-impacting-communication-collaboration-and-workflows-in-the-healthcare-market

]]>
Announcing AMD EPYC™ 4005 Processors https://community.connection.com/announcing-amd-epyc-4005-processors/ Jun 25, 2025 Derek Dicker https://community.connection.com/author/derekdicker/ https://community.connection.com/wp-content/uploads/2025/06/3091501-Blog-Post-AMD-EPYC-4005-BLOG.jpg

Affordable, enterprise class server solutions for small and medium businesses and dedicated hosting providers

As a server business leader at AMD, I get the privilege of sharing exciting innovations that truly meet our customers’ needs. Today, I’m thrilled to introduce the latest addition to the AMD EPYC family: the AMD EPYC™ 4005 Series processors.

These new processors are purpose-built to support small and medium-sized businesses (SMBs), dedicated hosting providers, and branch office IT environments that need cost-effective compute power that is reliable and easy to deploy. In a market long dominated by a single vendor, we are disrupting the status quo with compelling price-to-performance ratios and total cost of ownership (TCO) that make AMD a strategic choice for these verticals.

Advancing AMD EPYC 4000 series

The AMD EPYC 4005 Series is part of the AMD Enterprise Class CPU offerings, building on the previously released 4004 family. It’s based on the ‘Zen’ architecture designed to deliver high performance and efficiency. With core counts ranging from 6 to 16, TDP options between 65W and 170W, up to 192GB DDR5 ECC memory support, and PCIe® Gen 5 connectivity, the 4005 Series is designed for workloads that demand a balance of power, scalability, and affordability.

These processors power servers that are incredibly easy to deploy, making them an excellent fit for SMBs that don’t have deep IT benches or large budgets for consulting services. Ideal for running office-based workloads, building out branch infrastructure, or managing hosted IT services, these processors deliver reliable performance.

Advancing Innovation: Why Now Is the Time to Switch

A lot of organizations stick with the status quo simply because it is what they have always done. But in doing so, you may be missing out on real business benefits. For example, in addition to leading performance and energy efficiency, AMD EPYC 4000 Series processors offer up to 16 cores, allowing you to maximize your Windows Server® Standard or Datacenter license, which is sold with a base of 16 cores. Competitive solutions max out at 8 cores, leaving half of your license unused. The additional cores also provide headroom to grow with your business without constant reinvestment.
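The licensing math above is simple enough to sketch. This is a minimal illustration using the 16-core base license figure from the text; the `license_utilization` helper is hypothetical, not an AMD or Microsoft tool:

```python
# Windows Server Standard/Datacenter is licensed with a 16-core base
# (as noted above). Compare how much of that base license each CPU uses.

BASE_LICENSE_CORES = 16

def license_utilization(cpu_cores: int) -> float:
    """Fraction of a 16-core base license actually covered by the CPU."""
    return min(cpu_cores, BASE_LICENSE_CORES) / BASE_LICENSE_CORES

print(f"16-core CPU: {license_utilization(16):.0%} of the base license used")
print(f" 8-core CPU: {license_utilization(8):.0%} of the base license used")
```

An 8-core part leaves half of the license you paid for sitting idle; a 16-core part uses all of it.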

Built for the Real World

We’ve designed the 4005 Series with today’s real-world use cases in mind:

  • Office infrastructure: Quiet, power-efficient towers for offices without dedicated server rooms
  • Remote office/ branch office: Reliable compute that scales as your teams grow
  • Dedicated hosting: Energy-efficient performance that helps reduce operational costs

OVHCloud, a leading hosting provider, is already deploying EPYC 4005 CPUs following their success with our 4004 Series. Their need for high performance, affordability, and energy efficiency made AMD the clear choice. That same value proposition is what we’re bringing to SMBs across the globe.

Advancing Reseller and MSP Growth

EPYC 4005 processors revolutionize offerings for resellers and managed service providers (MSPs), empowering them to deliver enhanced performance, scalability, and efficiency. 

  • Highly competitive price points for entry-level systems
  • Enterprise-class features (RAID, ECC memory, AVX-512)
  • Leadership core density in class, optimized for Windows Server licensing

There are also real opportunities to build high-margin services on top of this platform. Whether you’re onboarding new clients, supporting a wider range of applications, or delivering AI capabilities, EPYC 4005 gives you the flexibility to scale without sacrificing profitability.

The Ecosystem Is Here

AMD EPYC 4000 series-based servers are available now from trusted OEM and ODM partners like Altos, ASRock Rack, Gigabyte, Lenovo, MiTAC, MSI, and Supermicro. Whatever your configuration needs, the ecosystem is ready.

Compatibility Without Compromise

We know compatibility can be a concern, especially in environments with legacy software or hardware. AMD EPYC 4005 processors are x86 compatible, offering seamless deployment for existing workloads. They are certified on popular Server OSes like Windows Server, RHEL, SLES, and Ubuntu®, and are supported by leading OEM/ODMs to help ensure the systems integrate easily into your IT environment.

With the launch of the EPYC™ 4005 Series, we’re bringing the power of EPYC innovation to a broader audience than ever before. Designed for small and medium-sized businesses, hosting providers, and IT leaders, this platform combines enterprise-grade performance, energy efficiency, and exceptional value—all in a right-sized solution.

Now is the time to reimagine what’s possible for small business IT. Whether you’re modernizing infrastructure, scaling services, or simply demanding more from your investment, EPYC 4005 delivers uncompromising performance and flexibility to help you grow with confidence.

Don’t settle for yesterday’s limits. Step into what’s next—powered by AMD. 

Ready to learn more about the advantages of AMD EPYC? Contact your Connection Account Team today!

]]>
The Indispensable Human Element in Software... https://community.connection.com/the-indispensable-human-element-in-software-asset-management/ Jun 24, 2025 Seth Mitchell https://community.connection.com/author/seth-mitchell/ https://community.connection.com/wp-content/uploads/2025/06/3082721-Human-Element-in-SAM-BLOG.jpg

In the rapidly developing landscape of technology, the role of software asset management (SAM) has become increasingly significant. As organizations strive to optimize their software investments and ensure compliance, the human factor in SAM remains essential. This article explores the necessity of maintaining a human touch in SAM practices, drawing parallels to the importance of human verification in the use of generative artificial intelligence (AI), a major talking point at this year’s IAITAM Conference in Las Vegas.

Recently, I was tasked to seek and review SAM utilities. It wasn’t surprising to learn that today’s Forrester-recommended tools are far more automated than the SAM publishers of our not-too-distant past. Software asset management has undergone significant transformations over the years. From its early days of managing on-premises infrastructure to the current focus on cloud-based solutions, SAM has continuously adapted to meet the changing needs of organizations. The evolution of SAM has been marked by the transition from deploying agent-based tools to scan for software installations to a more comprehensive approach that includes cloud infrastructure management. This shift underscores the need for skilled professionals who can navigate the complexities of modern SAM.

My colleague Casey Lindsay’s blog emphasizes that it is the people behind the tools, turning the dials, who are the most important part of an effective SAM strategy, despite advancements in SAM technologies. This sentiment resonates with the broader principle that human oversight is crucial in ensuring the certainty and reliability of technology-driven processes. 

Remember those new SAM tools I just evaluated? Their automation absolutely 100% needs real people—subject matter experts—to verify accuracy and identify areas where savings can be achieved creatively. Niche volume licensing metrics, you see, are neither recognized nor applied by the automation. SAM analysts play a vital role in interpreting data, making informed decisions, and ensuring that software license entitlements are appropriately assigned. Our expertise is essential in navigating the intricacies of licensing agreements, understanding the unique needs of the organization, and implementing strategies that optimize software usage while ensuring compliance.

The 2016 blockbuster film Sully—a true story—illustrates this concept in a unique setting. Tom Hanks’s character, Captain Chesley Sullenberger, stands before the National Transportation Safety Board (NTSB) to plead his case as commander of the stricken airliner. NTSB investigators immediately suggest pilot error as the cause of the river ditching, repeatedly demonstrating that computer simulations successfully landed the Airbus A320 on the runways of nearby airports. The captain and first officer point out that the computer simulations fail entirely to account for the human factor. A mere 35-second delay was then built in to the simulations to allow time for human pilot decision-making. No simulation thereafter could replicate its prior successful runs.

The necessity of human involvement in SAM can be related to the importance of human verification in the use of generative AI. While AI has made significant strides in generating content, it is not infallible. Human oversight is crucial to verify the accuracy of AI-generated content, ensuring that it meets the required standards and is free from errors or biases. 

In another real-world example, I recently met with my primary care physician for an annual check-up. He placed his iPhone on the countertop near us and stated that AI would be taking notes of our visit. Naturally, I had questions. Come to find out, the app leveraged Microsoft Copilot. Neat! Doc assured me that a review of the Copilot summary is required for accuracy prior to documenting my record. Nonetheless, the tool is effective in “reducing brainpower needed for the mundane task,” he pronounced with excited relief. 

Similarly, in SAM, human analysts are needed to verify that software installations are compliant with licensing agreements and that entitlements are correctly assigned. This verification process is critical to avoid costly compliance issues and to ensure that the organization is making the most efficient use of its software assets.

One of the key responsibilities of SAM analysts is to establish a current IT estate. This foundational step allows organizations to identify gaps, optimize their software usage, and ensure compliance. Human expertise is paramount to this process, as it involves interpreting complex data, understanding the nuances of licensing terms, and making strategic decisions that align with the organization’s goals. SAM analysts also play a role in vendor management, leveraging their knowledge and experience to negotiate better terms and maintain positive relationships with software publishers. This human touch is essential in achieving cost efficiency by ensuring that the organization derives maximum value from its software investments.  

As technology continues to evolve, the future of SAM will be marked by increasing complexity and the need for continuous adaptation. AI-driven SAM tools can analyze historical usage data and recommend adjustments to license agreements, but human oversight will remain essential to ensure that these recommendations are implemented effectively and in a manner that aligns with the organization’s objectives. Casey and I highlight the importance of human expertise in supplementing classical SAM engagements with trusted advice to ensure compliance, optimize investments, and achieve efficiency. This synergy between technology and human expertise is the key to navigating the complexities of modern SAM maturity and achieving sustainable success. 

Contact your Connection Account Manager to engage with us in support of your licensing goals—or learn more about our SAM and Microsoft Landscape Optimization services. 

]]>
Modernizing Manufacturing Communication: The... https://community.connection.com/modernizing-manufacturing-communication-the-case-for-ucaas/ Jun 18, 2025 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2025/06/3079658-Manufacturing-UCaaS-BLOG.jpg

The world of manufacturing is evolving fast. From connected machines to mobile field service, the expectation for seamless, secure, and flexible communication has never been higher. Yet many manufacturers are still operating with outdated voice systems, siloed conferencing tools, and a patchwork of legacy hardware. That’s where Unified Communications as a Service (UCaaS) steps in.

At its core, UCaaS is about giving manufacturers the modern communication foundation they need to work smarter, not harder. Whether it’s consolidating call platforms or simplifying remote collaboration, UCaaS helps IT leaders streamline infrastructure and empower users across every function.

It’s Not Just for the Office

In manufacturing, UCaaS isn’t just a back-office solution. While most plant floors aren’t wired for headsets and hotdesks, the rest of the enterprise—from logistics and engineering to procurement and sales—relies on dependable, connected tools to get things done.

That’s why many manufacturers are shifting to modern UCaaS environments, where calls, meetings, and messaging flow through one unified interface, accessible from any device, anywhere.

And as companies face pressure to do more with less, consolidating voice, video, and collaboration tools into a single platform is becoming a strategic move. It’s not just about cost savings—it’s about creating a more agile and secure infrastructure.

Ready for the Modern Meeting Room

Across manufacturing, IT teams are racing to modernize outdated meeting rooms. The goal? Walk in, push a button, and start collaborating—no adapters, no delays, and no IT tickets.

UCaaS plays a key role in enabling that experience. By aligning cloud-based communication tools with integrated AV setups, companies can make hybrid collaboration effortless, whether they’re meeting with suppliers, customers, or colleagues.

But Is the Infrastructure Ready?

Before UCaaS can deliver its full potential, the supporting infrastructure must be in place, including:

  • Up-to-date conference room hardware
  • Endpoints and mobile devices that support the preferred platforms
  • Security protocols for encrypted communication
  • Integration pathways with Microsoft 365, Webex, or other enterprise tools

Manufacturers that have grown through M&A or operate across multiple sites often find that UCaaS also helps unify fragmented environments—reducing complexity, eliminating redundancies, and setting the stage for future innovation.

Elevating Customer Experience with AI and UCaaS

The next evolution of UCaaS is already here—and it’s being powered by artificial intelligence. For manufacturers with customer-facing teams, support centers, or distributor networks, AI-enabled UCaaS platforms are opening the door to dramatically improved customer experience.

UCaaS platforms can now include intelligent call routing, real-time sentiment analysis, and AI-driven recommendations that assist human agents during live calls. In more advanced deployments, agentic AI can even handle full conversations autonomously, reducing wait times and improving service consistency without sacrificing quality.

For manufacturers, this means more efficient support, better visibility into customer needs, and the ability to scale interactions without scaling headcount.

A Few UCaaS Solutions in the Field

Connection works with a wide range of UCaaS providers to match each manufacturer’s technical requirements, compliance needs, and budget. A few examples include:

  • AudioCodes offers Microsoft-certified voice solutions that integrate natively with Teams, supporting secure communication in highly regulated sectors like defense manufacturing.
  • Cisco Webex remains a trusted UCaaS platform for organizations prioritizing robust voice, video, and contact center features, especially those already invested in Cisco networking gear.
  • 8x8 provides a flexible, scalable platform that blends UCaaS with CCaaS (Contact Center as a Service), ideal for distributed teams and customer-facing departments needing advanced analytics and uptime guarantees.

Whether the goal is tighter integration, AI-powered customer experience, or scalable conferencing, UCaaS is more adaptable—and powerful—than ever.

Compliance Matters—Especially for Government Contractors

For commercial manufacturers, standard UCaaS platforms offer strong encryption and meet most enterprise needs. But for organizations handling controlled unclassified information (CUI) or delivering to the U.S. government, additional scrutiny is required.

In those cases, Connection helps evaluate UCaaS options that support platforms like Microsoft GCC High or meet compliance frameworks like NIST and CMMC—ensuring communication stays both seamless and secure.

The End Goal: Simplicity, Security, and Flexibility

Ultimately, UCaaS is about creating a unified experience—one that empowers employees, simplifies IT management, and delivers measurable value.

It’s not flashy. It’s foundational.

And when done right, UCaaS becomes a strategic enabler—not just a communication tool.

Why Connection

At Connection, we help manufacturing organizations modernize communication infrastructure with solutions that align to their business goals, not just a single vendor. Our partnerships span trusted providers like AudioCodes, Cisco, 8x8, Kore.ai, and Microsoft, allowing us to tailor UCaaS strategies that fit your specific workforce, compliance, and customer experience needs.

If you’re ready to consolidate your communications environment and support your workforce wherever they are, engage our manufacturing practice to help you take the first step—with the right partner and platform to match.

]]>
Staying Ahead of System Failures: How AIOps... https://community.connection.com/staying-ahead-of-system-failures-how-aiops-can-safeguard-healthcare-operations/ Jun 12, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/06/3073021-AIOps-Healthcare-BLOG.jpg

In healthcare, when systems fail, it’s not just frustrating. It can be dangerous. Whether it’s an EMR, a secure messaging tool, or even an infusion pump, system disruptions can interrupt care, delay treatments, and increase compliance risks. As healthcare operations become more complex, staying one step ahead of these problems becomes even more critical. This is where AIOps comes in.

The Quiet Risk in the Background

While most conversations about healthcare technology focus on enhancing patient care or supporting clinicians, many operational leaders are laser-focused on something more foundational: compliance, governance, and cost containment. It’s not that patient care isn’t the top priority. It’s that we can’t deliver great care without stable, predictable systems behind the scenes.

Unfortunately, many organizations still rely on traditional monitoring tools that only flag issues after they have already occurred. These tools often generate large volumes of disconnected alerts, leaving IT teams overwhelmed and unsure of where to focus their attention. AIOps is designed to change that.

So What Is AIOps, Really?

AIOps (artificial intelligence for IT operations) is more than a buzzword. It’s a shift in how we manage infrastructure. Instead of waiting for something to break, AIOps platforms use historical patterns and real-time data to predict when a system is likely to fail. This gives IT teams the chance to respond before an incident impacts care.

For healthcare, this kind of foresight is powerful. It reduces unplanned downtime, frees up IT resources, and improves compliance performance, all without requiring huge overhauls or budget overextensions.

For example:

  • Cisco DNA Assurance monitors networks in real time and provides step-by-step recommendations to resolve problems quickly.
  • Dell APEX AIOps offers visibility across IT systems and highlights potential risks before users are affected.
  • HPE OpsRamp helps manage infrastructure across multiple locations through one simplified platform.
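
The core prediction idea these platforms share can be sketched in a few lines: learn a baseline from historical telemetry and flag live readings that deviate from it before an outright failure. The detector below is a minimal, hypothetical illustration; the class name, window size, and threshold are my own choices, not any vendor's API:

```python
from collections import deque
from statistics import mean, stdev

class BaselineAnomalyDetector:
    """Toy AIOps-style detector: flag readings that stray far from a trailing baseline."""

    def __init__(self, window=30, threshold=3.0):
        self.window = deque(maxlen=window)  # trailing history of "normal" readings
        self.threshold = threshold          # z-score above which we raise an alert

    def observe(self, value):
        if len(self.window) >= 5:  # wait for enough history to form a baseline
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                return True  # alert now, before a hard failure; keep the baseline clean
        self.window.append(value)
        return False

# EMR response times in ms: steady, then a spike that would precede a freeze
detector = BaselineAnomalyDetector()
readings = [102, 99, 101, 100, 98, 103, 101, 100, 99, 450]
flags = [detector.observe(r) for r in readings]  # only the final spike is flagged
```

Production platforms combine many signals with far more sophisticated models, but the workflow is the same: baseline, compare, alert early.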

A prominent healthcare organization leveraged IBM’s AIOps tools to enhance its IT operations, achieving 70% faster decision-making for resource allocation. This improvement led to more efficient management of critical healthcare services and helped IT staff prioritize the right issues, improve system uptime, and reduce delays in patient care.1

But AIOps isn’t just about enhancement. It’s about replacement too.

Facing the Reality of Automation

I recently heard from a colleague that an IT manager and a systems admin at one of our partner organizations were being replaced by AIOps bots and an outsourced help desk. It was a jarring reminder that we’re not just augmenting human teams anymore. We’re starting to automate away some of the roles we used to rely on.

And while that brings obvious efficiencies, it also brings a sense of unease. What’s the balance between humans and machines? Are we equipping our teams with the skills they need to stay relevant in this new model? These are hard conversations, but we must be willing to have them.

Where Connection Comes In

At Connection, we help healthcare organizations understand what AIOps can do and where to begin. We work with trusted partners to design solutions that fit your environment and goals.

Sometimes that means deploying solutions to automatically detect and resolve wireless network issues. Other times, it involves using dashboards to simplify visibility across multiple locations. Whatever the case, we focus on practical, scalable solutions that help teams work more effectively.

Looking Ahead

The use of AI in healthcare is growing rapidly. The global AI healthcare market grew from $1.1 billion in 2016 to more than $22 billion in 2023.2 Nearly 25% of U.S. hospitals are already using predictive analytics powered by AI to improve operations and care delivery.3

AIOps is part of that trend. It is expanding into areas like cybersecurity, capacity planning, revenue cycle workflows, and clinical data analysis. Organizations that take the time now to explore AIOps and build a strong foundation will be the ones best positioned to thrive in the future.

If you’re not sure where to start, you’re not alone. But staying in place is the one decision that will not help. Engage Connection’s Healthcare Practice to see how we can support smarter, more resilient healthcare operations for your organization. 

Sources:

  1. https://www.ibm.com/aiops
  2. https://www.dialoghealth.com/post/ai-healthcare-statistics
  3. https://www.aiprm.com/ai-in-healthcare-statistics/
]]>
Reimagining Manufacturing IT: Building a... https://community.connection.com/reimagining-manufacturing-it-building-a-smarter-safer-more-agile-industrial-future/ Jun 11, 2025 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2025/06/3072271-GTM-ISG-Manufacturing-BLOG.jpg

Manufacturing is at a pivotal moment. Industry 4.0 has evolved from buzzword to baseline, and leaders are now tasked with enabling flexible, resilient operations at scale—often across hundreds of sites, each with its own mix of legacy systems and modern platforms.

We see a common thread in the conversations we’re having with manufacturers across the country: It’s time to modernize the foundation. Not just for the sake of transformation, but to support real-world needs like safety, uptime, data-driven decision making, and security.

That’s why we’re focused on helping manufacturers shift to composable, edge-ready architectures—infrastructure that’s secure, scalable, and optimized for rapid deployment. These environments give both IT and OT teams the tools they need to innovate safely, drive automation, and monitor performance in real time. They also help bring shadow IT efforts out of the dark, giving engineering teams a supported path to experiment and scale.

Why Edge Is Now

Edge computing has become essential for manufacturers. From real-time safety systems to predictive quality analytics, more workloads are moving to the point of use. This reduces latency, avoids costly data egress, and allows teams to act on insights when and where they matter most. Whether it’s AI inferencing on a line or data acquisition for digital twins, edge is where the action is.

For instance, Baxter International implemented edge computing by deploying wireless sensors to monitor equipment vibration and temperature. This approach enabled real-time anomaly detection, reducing unplanned downtime and maintenance costs across their global manufacturing sites.1

But managing edge at scale is no small feat. Many operations leaders are juggling deployments across remote sites with limited local IT support. That’s where tools like AIOps—artificial intelligence for IT operations—come into play. With the right platform, manufacturers can automate infrastructure monitoring, speed up incident response, and ensure uptime across distributed environments.

Smarter Operations with AIOps

Manufacturers are increasingly adopting AIOps to help:

  • Monitor and respond to IT/OT incidents in smart factories
  • Predict infrastructure failures to minimize downtime
  • Automate anomaly detection across supply chain systems
  • Optimize edge performance in remote and rugged environments

AIOps gives teams the visibility and automation they need to manage complexity and keep operations running smoothly.
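
As a hypothetical sketch of the "predict infrastructure failures" item above: fit a trend to recent sensor readings and estimate how long until a failure threshold is crossed. The function name, readings, and alarm limit here are illustrative, not drawn from any specific product:

```python
def hours_until_threshold(readings, threshold):
    """Fit a least-squares line to hourly readings; estimate time to threshold."""
    n = len(readings)
    x_mean = (n - 1) / 2  # mean of the hour indices 0..n-1
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(readings)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    if slope <= 0:
        return None  # no upward trend, so no failure predicted from this signal
    return (threshold - readings[-1]) / slope  # hours until the limit is crossed

# Vibration amplitude (mm/s) climbing toward a 10.0 mm/s alarm limit
readings = [4.0, 4.6, 5.1, 5.4, 6.0, 6.5]
eta = hours_until_threshold(readings, threshold=10.0)  # roughly 7 hours of margin
```

Real platforms use richer models across many signals, but scheduling a maintenance window inside that predicted margin is exactly the kind of proactive action described above.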

A real-world example is General Electric’s implementation of AI-driven predictive maintenance. By analyzing sensor data, GE was able to anticipate equipment failures before they occurred, leading to significant improvements in efficiency, cost savings, and operational uptime.2

Security as a Foundation, Not an Afterthought

Of course, none of this works without a strong security foundation. Today’s manufacturers face a layered threat landscape—one that includes traditional cybersecurity risks, as well as emerging AI-related vulnerabilities like prompt injection or untrusted model outputs.

To address this, we’re helping manufacturers adopt advanced security tools like Cisco Hypershield for micro-segmentation and AI Defense for monitoring AI risks. These tools extend protection beyond the network, down to the kernel, virtual machine, and container level—essential for securing modern edge workloads.

For example, IronHeart Manufacturing enhanced its cybersecurity posture by implementing Cisco Hypershield. This solution provided deep workload visibility and enforcement, allowing the company to detect malicious behavior and control lateral movement in case of an attack.3

Access control is also top of mind. Managing who can interact with edge systems—whether it’s internal teams or third-party vendors—requires federated identity frameworks, protocol-specific controls, and full auditing. Alignment with standards like ISA/IEC 62443 and NIST is critical to maintaining both compliance and operational integrity.

The Bottom Line

Modern manufacturing demands a new kind of IT strategy—one that’s agile, secure, and built to support innovation at the edge. Whether you’re modernizing brownfield environments or building greenfield sites from the ground up, Connection can help you design and deploy an infrastructure that delivers on today’s priorities while preparing for what’s next.

If you’re ready to bring visibility, speed, and security to your industrial IT strategy, engage our Manufacturing Practice today to get started.

  1. https://aws.amazon.com/solutions/case-studies/baxter-monitron-case-study/
  2. https://www.linkedin.com/pulse/ai-case-study-saturday-predictive-maintenance-alastair-ppt6e
  3. https://www.cisco.com/c/dam/en/us/products/se/2025/4/Collateral/cisco-hypershield-ebook-architecture-use-case.pdf
]]>
TechSperience Episode 137: Quantum Computing... https://community.connection.com/techsperience-episode-137-quantum-computing-threats-and-countermeasures/ May 28, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/05/3056521-TechSperience-Ep137-Computing-Threats-BLOG.jpg

Quantum computing represents a significant shift in computational power, offering both opportunities and challenges for securing sensitive data. Join the Connection Security Center of Excellence team as we delve into the transformative threats that quantum computing poses for cybersecurity and the countermeasures we should be considering to address those threats.

We’ll explore the fundamentals of quantum computing, including concepts like qubits, superposition, and entanglement, and how these principles can enhance encryption and threat detection. Quantum cryptography, quantum random number generation, and quantum machine learning are some of the promising applications that could revolutionize device security.

However, quantum computing also poses risks to traditional cryptographic methods, potentially compromising data integrity, authentication protocols, and long-term data security. We’ll discuss the need for quantum-resistant cryptographic algorithms and the importance of transitioning to quantum-safe technologies to protect against future quantum-enabled threats.

Join us as we navigate the exciting yet complex landscape of quantum computing and security, highlighting the advancements, challenges, and future directions that will shape the approach to cybersecurity in the quantum era.

For more information on how to better secure your environment, visit connection.com/cybersecurity.

Speakers
John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection 
Kimberlee Coombes, Security Solution Architect, Connection
Lindsay Nelmes, Microsoft Solution Sales Executive, Connection

Show Notes
00:00 Introduction to Quantum Computing and Cybersecurity
02:31 Real-world Impacts of Quantum Computing
04:49 The Timeline for Quantum Threats
06:29 Industries at Risk and Proactive Measures
08:36 Understanding Quantum Resistant Algorithms
09:59 Leveraging Quantum for Cybersecurity Improvements
12:08 AI and Quantum Computing Synergy
14:07 Microsoft’s Role in Quantum Security
15:30 Future Milestones in Quantum Computing
18:02 Misconceptions About Quantum Computing
20:12 Final Thoughts and Takeaways

]]>
The Evolution of the Microsoft Partner Ecosystem https://community.connection.com/the-evolution-of-the-microsoft-partner-ecosystem/ May 22, 2025 Casey Lindsay https://community.connection.com/author/casey-lindsay/ https://community.connection.com/wp-content/uploads/2025/05/3048256-Evolution-of-the-Microsoft-Partner-Ecosystem-BLOG.png

With the dust settling from the recent celebration of Microsoft’s 50th birthday in April, I thought it would be timely to sit down and put my thoughts on paper. Having worked for a Microsoft partner for the past 20 years, I’d like to discuss the incredible journey that we have experienced within the Microsoft partner ecosystem. As we set forth into Microsoft’s 51st year, one thing I know we can continue to count on in partnership with Microsoft: change will continue—and partner adaptation will be the key to our collective success. 

Let’s take a look at the partner and customer experience as they stand today, what has changed, and why. It is also important to dig in and understand the logic behind why Microsoft continues to make changes like this, and what they mean for the role of the partner community.

Adapting Partner Strategies to Support Customer Success

For Microsoft partners, gone are the days of celebrating the sale of the Microsoft contract. As the ink dries on these contracts, partners must now shift focus to the services we can deliver to support our customers in their consumption of these technologies and their ongoing digital transformation journey. Partners have had to evolve and build a deep bench of Solution Architects and Engineers to design and execute customers’ deployment strategies and priorities. 

To ensure measurable success, partners must establish a roadmap for the customer to document their deployment priorities of these Microsoft technologies. Partners must then quickly pivot to engage these Architects and Engineers to help execute this mission in the form of professional services. 

It is equally important for the partner to identify areas of duplicate technology spend to help optimize and justify what is typically an increased investment in the Microsoft stack. This does two things: (1) it builds an internal business case for the customer, and (2) it helps document the story that they’ll take to Microsoft to negotiate a new contract term. 

Essentially, partners must now be in lockstep with the customer at every step of their journey to ensure lasting success and a measurable return on their Microsoft investment. Compounding these factors, the diminishing financial return on these Microsoft contracts has led partners to charge for advisory services so they can continue to support these vital Microsoft resources in-house. Partners are offering consulting services, typically in conjunction with the strategic decisions at renewal time. Across the partner landscape, I’m seeing these advisory services come in the form of point-in-time consulting, typically delivered under an executed SOW, as well as managed services that support the customer over the contract lifecycle, providing checks and balances along the way.

In all cases, this changes the game: customers must now be prepared to pay for these kinds of services on top of the Microsoft contract. This shift will have a compounding effect, resulting in a significant mindset change that requires adaptation on the part of the partner and especially the customer!

What Does This Mean for Customers?

For the customer, the navigation of product releases, product updates, and pricing changes will continue to be par for the course. After all, this is Microsoft, and one thing we can always count on is change! But this shift in partner focus is a big adjustment and not one that I expect customers to take lightly. Investing in these professional services and documenting this business justification will expedite customers’ time-to-value, help them to get more out of their Microsoft investment, and realize ROI at a much faster pace. 

I expect that the investment customers make in these kinds of services will pay for itself many times over in the form of optimization and negotiation posture. This experience with their Microsoft partner will help establish stronger footing and build a business partnership beyond the Microsoft stack, both with the partner and with Microsoft directly. Customers will see their Microsoft partners more heavily engaged and in rhythm with them on their technology journey, no longer just showing up when the contract is up for renewal. Over time, customers will grow accustomed to these billable professional services being part of the partner ethos as it relates to their Microsoft investment and technology strategy.

Understanding Microsoft’s Strategic Shift in Partner and Licensing Models

Why has Microsoft made these profound changes in how they pay their partners and made selective decisions to take some of these contracts direct? First, it is important to know that change with these Enterprise Agreements (EAs) is not new. On July 1, 2016, Microsoft increased the minimum seat count for an EA from 250 to 500 users, which came with its own turbulent ride. More recently, on January 1, 2025, Microsoft announced the start of an EA “phase out” affecting select Level A EA customers (<2,400 users). This change was designed to drive these customers to consider a direct model with Microsoft called the Microsoft Customer Agreement (MCA) or the option to move to a partner-led Cloud Solution Provider (CSP) agreement. 

With either of these moves, the customer is going to feel the impact in the way of loss of programmatic discounts tied to EA Levels B, C, and D. Additionally, if a customer maintains “From SA” subscriptions, they will not be able to carry these programmatic discounts into the MCA or the CSP agreements, at least as it stands today. Lastly, the EA also carries with it inherent benefits in the form of price locking and the ability to negotiate contract terms and pricing, which would also likely be eliminated with these moves to MCA or CSP. It’s hard for me to foresee the guardrails on these kinds of rules being removed, but I guess anything is possible!

I believe that these changes drive an indirect pivot in partner strategy, encouraging partners to focus on selling professional services that correspond to the Microsoft investment. This strategy will help customers consume these technologies at a more rapid and measurable pace. Microsoft has changed the game in a big way with how it pays its partners, and although change can be uncomfortable, I can understand the logic. 

While the sale of the license is still important, it’s the partner follow-through that has become the shining moment. In today’s climate, partnering with the customers as they embrace these technologies to increase business efficiencies, resulting in positive outcomes and increased profitability, is now what calls for celebration. We have seen this as an evolution that has forced the partner community to adapt. And the hard truth is that the partner profitability on these EAs has diminished over time, which has forced this pivot in business strategy to stay relevant. 

To level the playing field, Microsoft partners have had to change the way that they do business due to the challenges to maintain these Microsoft resources on their own payroll. It has become commonplace to see Microsoft partners charging for advisory services and the ongoing management and administration of these Microsoft contracts—not to mention the mission to move customers from EA to CSP Agreements, which can offer more flexibility and financial optimization for the customer.

Embracing Change

To conclude, the road ahead will continue to have its bumps and curves. There is no doubt that customers that prepare and adapt will find themselves well-positioned for success in the future. It’s highly important for customers to stay informed and engaged with both their Microsoft partner and Microsoft directly. This is a journey that will come with challenges, and as long as the partner stays engaged with their customer—and the customer engaged with their partner—the opportunities to mitigate risks and capitalize on the benefits are better than ever! 

Partners must lead with a proactive approach to support their customers in staying ahead of these big strategic decisions. By doing so, customers will be better prepared, have the opportunity to learn about and implement new emerging technologies, and maintain a competitive edge in the market. The byproduct of this journey together is a level of trust unlike anything we’ve seen in the past. We’re in the midst of a technological revolution, especially with the advent of AI and machine learning. The elite partners have a chance to stand out by acting as the customer’s advocate and guide them in a sherpa-like way, with every step, to ensure each decision that is made is properly calculated and measured for success! Contact us today to learn how we can help you navigate the complex world of Microsoft licensing. 

]]>
Navigating the Transition from Windows 10 to... https://community.connection.com/navigating-the-transition-from-windows-10-to-windows-11-a-focus-on-security/ May 21, 2025 Ashley Lofaro https://community.connection.com/author/ashley-lofaro/ https://community.connection.com/wp-content/uploads/2025/05/3049671-SRI-Post-Event-Nurture-BLOG.jpg

As the end of support for Windows 10 approaches, IT professionals are gearing up for the transition to Windows 11 Pro devices. This shift, while significant, brings a host of benefits and improvements, particularly in the realm of security. In this blog, we’ll explore the key points discussed in a recent webinar hosted by James Hilliard, featuring insights from Rob McGilvery of Microsoft and Eric Chong from Intel.

Understanding the Timeline

The clock is ticking: October 14, 2025, marks the end of support for Windows 10. It’s crucial for IT departments to be aware of this date and ensure their systems are updated to Windows 11 to avoid any disruptions.

Adoption Rates and Benefits

Enterprises are leading the charge in adopting Windows 11, with over 50% having made the transition. Small businesses, while slightly behind, are expected to catch up as they refresh their devices. The new user experience in Windows 11 has been well-received, with users spending more time on their PCs and reporting higher satisfaction.

Successful transitions have treated Windows 11 as a feature update to Windows 10, following existing servicing strategies. It’s essential to understand which devices can receive the update and plan for refreshing those that cannot.

Security Enhancements

Security is a major focus in Windows 11 Pro devices, with Microsoft adopting a chip-to-cloud approach to reduce security incidents and protect users from modern cyber threats. Microsoft is committed to putting security above all else, with devices that are secure by design and secure by default. 

Additional Security Enhancements

Microsoft’s chip-to-cloud security approach includes three core pillars: secure by design, secure by default, and secure operations. By raising the baseline hardware requirements and turning on virtualization-based security features by default, Microsoft has seen a 62% reduction in security incidents on Windows 11 Pro PCs compared to Windows 10 PCs.

Intel’s Meteor Lake-based platforms have evolved to help IT support Microsoft’s latest advancements in Windows 11 Pro devices.

In the new class of Copilot+ PCs with Intel’s Lunar Lake chip, Microsoft has integrated security processor updates, which are now delivered quickly through Windows Update. Intel’s vPro machines feature runtime BIOS resilience, which validates the firmware using certificates and hardens the memory address where the firmware is loaded. This prevents vulnerabilities during the boot-up process, ensuring the machine is secure even before the operating system loads.

Windows 11 introduces passkey support, allowing users to log in using biometrics such as facial recognition. This enhances security by reducing reliance on passwords and protecting sensitive information. Users can also encrypt files with their biometrics, adding an extra layer of security.

Security vendors are leveraging AI to detect deepfakes, with detection models that can run locally on the NPU. This improves performance and response time, providing a 300% improvement in detecting AI-generated audio or video.

Windows 11 also introduces hot patching, which delivers security updates in the background without end-user disruption, reducing the need for restarts and enhancing productivity.

Conclusion

The transition to Windows 11 is not just about upgrading an operating system; it’s about enhancing security, improving productivity, and ensuring a smooth user experience on new modern Windows 11 Pro devices. IT professionals should leverage the insights shared to prepare their organizations for a successful transition.

Additionally, Connection’s Professional Services Team offers a comprehensive Windows 11 Readiness Assessment to help organizations prepare for the transition. This assessment evaluates device compatibility, identifies hardware gaps, and provides tailored recommendations to ensure optimal functionality of Windows 11 in your environment. By leveraging the expertise of Microsoft specialists and a large ecosystem of Windows 11 Pro device vendors, Connection ensures a smooth transition and maximizes productivity. Connection’s full scope of lifecycle services includes deployment of new assets, reverse logistics, warranty and repair services, and disposal of old assets.

Click here to listen to the webinar, Navigating the Transition from Windows 10 to Windows 11: A Focus on Security.

Learn more about the key benefits and outcomes of our Readiness Assessments.

Contact your Connection Account Team to learn more.

]]>
Transferring Your Microsoft Enterprise... https://community.connection.com/transferring-your-microsoft-enterprise-agreement-to-csp/ May 20, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/05/3046173-Microsoft-EA-to-CSP-Communication-BLOG.png

April 2025

Announcement

Microsoft has introduced a new process that allows partners to transfer online services from an Enterprise Agreement (EA) to CSP at renewal. Simply approve Connection as your CSP reseller, and we’ll access your eligible licenses for transfer. We’ll review your renewal needs and schedule the transfer into CSP, and your CSP subscriptions will activate once the EA expires.

Importantly, you'll retain your entitlement to Teams without separate purchases! This new transfer moves your suites with Teams to CSP automatically.

Connection, a direct CSP partner, offers solutions and services through our CSP+ program, including:

  • Cloud break-fix support
  • Entitlement and renewal reviews by our Microsoft Consultant Team
  • Azure Advisor sessions with our Cloud Solution Architects
  • Complete lifecycle management from onboarding through renewal
  • Discounted workshops, assessments, and professional services
  • Managed Services

Contact your Connection Account Team for more information.

]]>
The Reality of Caregiving in the Age of... https://community.connection.com/the-reality-of-caregiving-in-the-age-of-technology/ May 16, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/05/3015621-Caregiver-BLOG.jpg

I was still at HIMSS when I got a call that earlier in the week, my uncle, my mom’s youngest brother, had suffered a stroke. My uncle, who has no cell phone, cable television, Internet, or email address, finally directed a neighbor through the unlocked side door of his home to a sticky note affixed to his kitchen cabinet. “Neece,” it read.

I’ve been with him since, navigating the complexity of our healthcare system as caregiver to someone who doesn’t especially want my help and who can, in annoying turns, advocate for his needs in a voice several decibels louder than polite conversation would seemingly dictate. I make a mental note to have his hearing checked while I’m still in Teeny-tiny-town, Nebraska (t3N). 

I secure medical and financial power of attorney, not to speak for him, but to navigate the technology that is necessary to facilitate his care, technology that as Director of Healthcare at Connection I not only know, but champion with our partners and our clients. I feel like I’m now experiencing a kind of karmic payback for every piece of technology I’ve used or sold over the last 30 years.

Internet

Not every community in the U.S. has access to broadband. It’s one of the reasons why programs like The Community Connect Grants, administered by the USDA Rural Development, are so critical to preserve. Thankfully my uncle’s community is covered by 5G, and after considerable handwringing on my part, I dropped in a T-Mobile 5G router so that I could work remotely.

“They’re listening,” my uncle protests as I set the router in the window of his tidy cottage, paranoia a side effect from the stroke or years of relative isolation. “Maybe,” I offer as a concession. “But they’ll probably get bored and move on.” Intuitively, I know better than to shame him for thinking that some deep-state cabal is listening to our conversations through the router.

Medicare

Though my uncle was eligible for Medicare two years ago, he was unable to file because he couldn’t get an appointment at the local-ish Social Security office and lacked the technical skills or tools to file online. To preserve his autonomy, I spent the first week in t3N placing call after call to both the local and national Social Security 800 numbers in an effort to get a local appointment for my uncle. In more than 20 calls placed over the first five days in town, I once waited 293 minutes only to be disconnected by their phone system. 

Finally, I drove him to the local-ish Social Security office and was unceremoniously thrown out by a security guard. 

“You can’t be in here,” he admonished, thrusting a torn sheet of paper with a QR code on it into my uncle’s hand. 

My uncle started, “I don’t have a cell phone—”

The security guard cut him off. “I don’t care. You can’t be in here without an appointment.” 

“He can’t get an appointment,” I began. 

“I know,” replied the security guard. “You can’t be in here.”

The pending in-person identity checks for new and existing Social Security recipients would be an unrealistic expectation under the current structure and staffing for field offices like this one. 

Healthcare

The bright spot in t3N is the variety and high quality of the care delivered effectively by some of the most compassionate providers I’ve met as both a technologist and as a patient. t3N seems untouched by the crises facing healthcare: bed shortages, beleaguered staff, and inadequate staffing. I was able to secure follow-up appointments with primary care, specialists in a variety of disciplines, and social workers who brought additional services to my uncle.

Ambient Dictation

At one visit, our nurse informed my uncle and me that she would be using ambient technology, and though I’d like to describe the “how” here, I’m sensitive to the fact that revealing the “how” potentially reveals the system and the community where my uncle is receiving care. I owe both their privacy.

I can share that while the ambient technology was engaged, the caregiver was still behind a 27" curved display, completely removed from our line of sight. While she looked at her screen and the ambient technology recorded my uncle’s case history, there were moments of repetition, suggesting that the technology was listening, but the nurse wasn’t. 

This repetition is no different from any patient/caregiver interaction; there’s a fair amount of that anyway, but the lens changes when ambient listening is introduced. As technologists, we’re telling our clinicians that AI mitigates the charting burden, and we’re expecting them to be more engaged in the appointment. If the technology is supposed to get the clinician out from behind the computer, it fails so far on that point—but it may be more successful in the dashboards and metrics available over multiple encounters with my uncle and over the broader population. 

We’re still very early in my uncle’s patient journey and early in the use of ambient technology as well.

Patient Portal

Since I’ve only ever been responsible for my own care, having access to another person’s medical records is a new experience for me—one that as a caregiver, I need in order to give my uncle the best opportunity to speak for himself. 

I’ve been impressed by the speed at which tests, lab results, medication recommendations, and direct communication between his extended care team enter the patient portal, which enables my uncle to behave independently when I’m here and maintain compliance when I’m not. This is where healthcare digital transformation is at its most powerful.

He is still making calls from his landline, and his care team leaves messages on his Radio Shack answering machine, but on either side of those instances is the portal and the power of having access to his care team in near real-time.

Healthcare Still Needs to Be More Inclusive

The experience of caring for my uncle on the heels of seeing cutting-edge advancements in artificial intelligence in healthcare at HIMSS25 in Las Vegas was a humbling one. Our industry can’t lose sight of the estimated 42 million Americans who lack access to broadband. Although it’s estimated that only 600,000 people in the U.S. don’t have a cell phone (my uncle proudly among them), it’s more difficult to quantify how many people still lack technology literacy, whether by circumstance or by choice. In my uncle’s wide circle of friends, he’s not the only one living as a Luddite. Digital natives are the fastest-growing part of our population, but as healthcare technologists, we should build systems that care for everyone.

For more information about the latest healthcare technology developments, be sure to visit our healthcare technology solutions page.

]]>
Securing Retail in 2025: Top Cybersecurity... https://community.connection.com/securing-retail-in-2025-top-cybersecurity-threats-and-how-to-prepare/ May 15, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/05/3041571-SecuringRetailin2025-BLOG.png

Retailers continue to face mounting cybersecurity challenges, with bad actors evolving and exploiting new vulnerabilities faster than ever. As retail organizations expand their digital footprints—from eCommerce platforms to IoT and expanded mobility devices—their exposure to cyber risk grows. Based on the latest data and observations, it’s clear that cybersecurity can no longer be a secondary priority. It must be woven into every aspect of the retail experience—from frontline employee training to backend systems and supplier networks.

The Most Common Cybersecurity Threats in Retail

Cybercriminals are constantly adapting their tactics, and retailers remain prime targets. Here are the top cyberattack methods facing the industry in 2025:

  • Credential Phishing: 58%1
    Phishing continues to dominate, with attackers crafting increasingly convincing messages to steal login credentials from retail employees and vendors.
  • Malware: 21.74%1
    Malware remains a steady threat, often used to gain persistent access to retail systems or to harvest payment and customer data.
  • Ransomware: 13.04%1
    Ransomware attacks can bring retail operations to a halt. Threat actors demand payment in exchange for encrypted business-critical data, causing costly downtime.
  • Distributed Denial of Service (DDoS): 10.14%1
    DDoS attacks aim to overwhelm retail networks and eCommerce platforms, especially during peak shopping seasons.
  • Other Attack Methods: 24.65%1
    These include insider threats, social engineering, and third-party vulnerabilities, all of which pose serious risks.

Retail: Still a Top Target

The retail sector remains one of the most targeted industries for cyberattacks. In fact, about a quarter of all cybercrimes are aimed at retailers.2 Retailers often store sensitive customer data and operate complex, distributed systems that may be difficult to secure consistently. From in-store POS systems to mobile apps and online portals, attackers are looking for the weakest link.

The High Cost of a Breach

A single breach can have lasting financial and reputational damage. While the immediate costs of a data breach are steep, the downstream effects can be even more damaging:

  • 53% of retailers report reputational damage following a cyberattack, often resulting in loss of customer trust and declining revenue.3
  • 33% of retailers faced regulatory fines due to non-compliance or failure to protect sensitive data.3

Cybersecurity is no longer just an IT concern—it’s a business risk with legal and financial implications.

Human Error Remains a Key Risk Factor

Despite advances in cybersecurity technology, human error continues to be a leading cause of breaches:

  • 78% of temporary retail employees hired in Q4 of 2024 did not receive social engineering training, leaving them vulnerable to phishing and impersonation tactics.3

This highlights the importance of employee training and awareness programs. Even the most advanced security solutions can be undermined by a single click on a malicious link.

Building a Resilient Future

Cybersecurity in retail requires a proactive, layered approach:

  • Implement advanced threat detection and response solutions
  • Conduct regular security assessments and penetration testing
  • Prioritize employee education and role-specific training
  • Ensure third-party vendors adhere to strict cybersecurity standards
  • Invest in data encryption and zero trust access policies

As retailers prepare for future growth and transformation, building cyber resilience must be a top strategic priority. By staying informed and investing in robust cybersecurity practices, retail organizations can protect both their brand and their customers.

]]>
Future-proofing IT Operations with AIOps:... https://community.connection.com/future-proofing-it-operations-with-aiops-three-things-to-know/ May 06, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/05/3033871-GTM-MIM-HPE-AIOps-BLOG.jpg

The word is out: Artificial Intelligence Operations (AIOps) is transforming IT operations, bringing automation, efficiency, and predictive insights to the forefront. An AIOps platform can empower you to observe your infrastructure continuously for potential disruptions, predict problems before they occur, and help ensure you do not face the same problem in another IT environment.

In a recent podcast titled Future-proofing IT Operations with AIOps and Automation, Cameron Bulanda, VP of Technical Sales and Centers of Excellence at Connection, and Taruna Gandhi, Head of Product Marketing for HPE OpsRamp, discussed how AIOps isn’t just about IT efficiency—it’s about unlocking innovation, reducing risk, and positioning organizations for the future.

Sound enticing? Here are three things you need to know.

  1. Observability is key for efficient IT infrastructure management. It’s hard to get to where you want to go, or where you need to be, unless you know where you’re starting from, noted Bulanda. This is especially important, Gandhi agreed, because the world of IT operations is heterogeneous and hybrid. “You can’t measure or optimize something you can’t see. Observability is the starting point for modernizing IT operations—AIOps provides a single, unified view of a distributed and hybrid IT estate.”

    Given that businesses are relying on multicloud and hybrid infrastructures, gaining full-stack visibility is critical. Taruna Gandhi notes that “most organizations we talk to have multiple data centers across the globe, plus at least two public clouds. How do you know what you own? How well it’s performing? Where the hot spots are? This is where observability becomes the game-changer.”

  2. AIOps breaks down traditional IT silos. For too long, silos have been the norm, observed Bulanda. “Cloud, storage, security—each team operates within its own lane. But this approach often slows down resolution time and creates inefficiencies.”

    HPE’s solution? “We recognized early on that IT environments would be hybrid and heterogeneous, so we designed our AIOps to provide a single control plane across the entire infrastructure, from bare metal to virtualization to containers, on any cloud or on-premises,” noted Gandhi.

    AIOps integrates these views—so teams can see how a business service, which depends on multiple components, is actually performing. This helps break down silos between team members, enabling them to collaborate in real-time.

  3. AIOps is your ticket to proactive problem-solving. AI-driven IT infrastructure management isn’t just about fixing what’s broken—it’s about preventing issues before they happen. Gandhi explained, “If an application is running out of memory, AIOps can see the trend and allocate resources before performance suffers. If capacity is projected to run out in three months, IT can get ahead of the demand instead of reacting to a crisis.”

    This proactive approach is further enhanced by HPE GreenLake, which offers customers a cloud-like experience for IT operations management. “Everything is available in one platform with single sign-on,” Gandhi said. “Customers can expand usage, explore additional services, and manage their IT environment more efficiently.”

The speakers noted AIOps is only scratching the surface of what’s possible. “We’re moving toward agentic AI, where automation is fully integrated with AI insights,” Gandhi said. “Today, we already automate routine tasks like patch management and configuration. The next step is AI-driven, policy-based automation, where the system resolves issues autonomously.”

Choosing the Right Partner for AI-driven IT Management

For organizations considering AIOps, Gandhi advised: “Start with business objectives. What outcomes do you want? Ensure your observability data is complete—no blind spots. Look for a partner with a strong ecosystem and proven expertise.”

Technology is only part of the equation, noted Bulanda. The right partner ensures long-term success by supporting your growth, integrating with your existing stack, and helping you navigate change.

Take a deeper dive into this topic:

]]>
Unlocking the AI PC Advantage: What Every... https://community.connection.com/unlocking-the-ai-pc-advantage-what-every-business-needs-to-know/ Apr 29, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/04/3007406-Lenovo-X-Intel-Post-Nuture-BLOG.jpg

In today’s rapidly evolving digital landscape, businesses are constantly seeking ways to enhance efficiency, security, and personalization. Our recent webinar, Unlocking the AI PC Advantage: What Every Business Needs to Know, provided valuable insights into how AI-enabled PCs can revolutionize business operations. Hosted by Kelly Malone, the webinar featured industry experts—Maryann Wrenn from Lenovo, Eric Chong from Intel®, and Chi Chung from Connection—who shared their perspectives on the transformative power of AI in personal computing.

The Evolution of AI-enabled PCs

AI-enabled devices have become a pivotal shift for businesses, offering capabilities that were previously unattainable. The integration of AI into IT infrastructures can significantly enhance productivity, streamline workflows, and improve security. As Maryann Wrenn highlighted, Lenovo’s innovation story is driven by feedback from customers, ensuring that their solutions meet the evolving needs of businesses.

Key Benefits of Lenovo and Intel® AI PCs

  1. Enhanced Efficiency and Productivity: AI-enabled PCs powered by Intel® Core™ Ultra processors offer better power efficiency and optimized battery life. The integration of Neural Processing Units (NPU) allows for continuous low-power AI tasks, such as virtual backgrounds and noise suppression, freeing up CPU cycles for other critical tasks.
  2. Improved Security: AI PCs keep data processing local, reducing the need to send information to the cloud and back. This localized processing enhances data privacy and security, making it ideal for businesses concerned about sensitive information.
  3. Streamlined User Experience: AI-powered features, such as real-time transcription and translation, improve collaboration and communication within Microsoft Teams. The ability to run AI tasks locally ensures a seamless and efficient user experience.

The webinar also delved into various real-world applications of an AI PC. For instance, AI is being used in healthcare to prioritize medical scans, in education to enhance student writing, and in business-specific solutions like fraud detection. The adoption of AI PCs is expected to become the standard rather than the exception, with businesses leveraging AI to stay ahead of the competition.

Starting Your AI PC Journey with Lenovo, Intel®, and Connection

For businesses interested in starting their AI PC journey, Lenovo and Intel® offer a range of innovative solutions designed to futureproof workplaces. Lenovo’s AI PCs, powered by Intel® Core™ Ultra processors, provide personalized solutions that streamline workflows, optimize performance, and enhance user experience. Connection’s expertise in AI consulting, design, and deployment ensures a smooth transition and ongoing support for businesses adopting AI technology. By partnering with Connection, businesses can harness the power of AI and Lenovo AI PCs to enhance efficiency, security, and personalization. Visit our Lenovo and Intel® showcase page today to learn more!

©Intel Corporation.

]]>
The Perfect Model of the World Is Here https://community.connection.com/the-perfect-model-of-the-world-is-here/ Apr 24, 2025 James Rust https://community.connection.com/author/james-rust/ https://community.connection.com/wp-content/uploads/2025/04/3019149-Blog-Omniverse-Perfect-Model-BLOG-1.jpg

For my senior design project in engineering school, we were asked to look at a new factory for a large manufacturer that was having trouble getting product out the door on time. They were completely surprised by the issue since they had directly copied their layout and modus operandi from an existing factory and were at a loss as to what the problem could be.

What AutoCAD didn’t tell them was that if you effectively copy and paste the layout of a 100,000-square-foot facility into a factory more than double that size, it’s completely normal to add transport time to every single process. While AutoCAD provided a visual representation, it lacked the simulation capabilities necessary to predict and mitigate these logistical challenges. However, software exists today that would have prevented that issue from ever taking place.

Simulations Save You Time and Money

NVIDIA has developed OpenUSD-based software called Omniverse that can create highly accurate simulations of our world. This software facilitates precise 3D representation of objects and facilities and—coupled with the ability to replicate operational physics—provides engineers and designers with critical insights into facility performance.

BMW has utilized this software to create a digital twin of their new factory prior to construction, allowing for a solid look at their operation before it begins. This strategy permits the early detection and mitigation of potential process issues stemming from the factory layout, avoiding expensive physical alterations.

This approach also allows them to discern cost-saving measures by simulating both the facility and manufacturing performance in terms of energy use, allowing the designers to make changes that improve energy efficiency by up to 35%. Operations can also be streamlined with a focus on cycle time and material flow, guaranteeing a lean operation.

The Best Design and the Best Testing Grounds

Automotive manufacturers are leveraging custom Omniverse-based software to achieve unparalleled vehicle design accuracy. This software can interface with multiple design programs, allowing for real-time collaboration between designers, engineers, and stylists, significantly accelerating the design process. The sophisticated simulation environment allows for rigorous testing, identifying design flaws early, and reducing reliance on physical prototypes. Once the model is complete, the designers can take advantage of the advanced physics in Omniverse to test drive their vehicle and solve any issues that might arise before production begins. Moreover, these simulations provide data that has proven invaluable for training autonomous vehicles, and by replicating complex and hazardous situations in the system, they can ensure the highest level of vehicle safety.

Amazon is using an application built on Omniverse called NVIDIA Isaac Sim to test their new robots in the digital world before deploying them in the real world. By constructing intricate digital twins of their warehouses, Amazon engineers can simulate and refine robotic workflows before utilizing them in the field. This also enables them to generate synthetic data for training robots on complex tasks, similar to the automotive industry’s application for autonomous vehicles. Because everything is digital, robots can be tested against a far wider variety of situations, objectives, and events than would be possible in the physical world. Once trained, these models can be tested before implementation to see how they work in tandem with human workers, minimizing errors and allowing for easier adoption.

From Daunting to Doable

It’s completely understandable to look at the capabilities of this tool and imagine how long and difficult it will be to model every aspect of our physical world. Spending time running simple simulations with preexisting assets on my own was enough to make me think this was simply too big, but then I realized this isn’t meant to be a tool that is used by a single person. It’s a collaborative tool that an entire company with many different roles can use to work together more effectively and save staggering amounts of resources. Digital twins can significantly improve margin and prevent unforeseen issues with large capital investments, so it is certainly a capability worth pursuing for many industries. Naturally, this is an endeavor that is going to be long and arduous. Like any modeling software, it can only be as good as the data that is fed into it.

However, it’s not a challenge you need to tackle alone or even with your team. Connection has endeavored to gain the knowledge and resources to make navigating the Omniverse platform far easier for our customers. If you’re looking into this exciting new technology, don’t hesitate to contact the CNXN Helix Team today. We’re here to ensure that you’ll get the best results as quickly as possible as you delve into new technologies.

To learn more about AI solutions built for manufacturing, visit our website today.

]]>
Transforming Clinical Communication with... https://community.connection.com/transforming-clinical-communication-with-technology/ Apr 22, 2025 Kelly Kempf https://community.connection.com/author/kelly-kempf/ https://community.connection.com/wp-content/uploads/2025/04/3011371-Apple-in-Healthcare-BLOG.jpg

Clinical communication is vital to the delivery of safe, high-quality patient care. Effective and timely communication among everyone managing patient information is essential to optimally perform tasks that support patient care. Providers use tablets, smartphones, and other mobile devices for secure messaging, video, voice, and critical alerts. Ideally, this communication would be on a single platform with fewer devices.

Today’s successful healthcare institutions view their mobile platforms as mission critical. Choosing the right platform is an essential enterprise decision directly affecting the quality of patient outcomes. Investments made by technology partners, hardware, and software alike have significantly advanced healthcare developments in this arena. The commitment to enhancing clinical communication is invaluable, improving experiences for both clinicians and patients. Here we highlight Apple’s contributions.

Apple’s Clinical Communication Ecosystem 

In recent years, Apple has made significant strides in the healthcare industry, leveraging its powerful ecosystem to revolutionize clinical communications. Today, a single Apple device equipped with third-party iOS or iPadOS applications (such as Electronic Medical Record integrated applications) enables clinicians to manage alerts, coordinate care, document patient observations, and complete interventions. Additionally, Apple enhances home health operations by facilitating two-way communication between patients and their care teams beyond healthcare facilities. iOS is packed with amazing healthcare features that offer institutions powerful ways for clinicians to stay connected to patients, provide useful information, keep data secure, and protect patient privacy—ultimately improving patient outcomes.

Empowering Clinicians 

Apple’s devices—including the iPhone, iPad, and iMac—have become indispensable tools for healthcare professionals. By utilizing healthcare applications via an iPhone or iPad, clinicians benefit from secure, integrated communication capabilities on a multipurpose device. These devices enable seamless patient care, whether at the bedside or remotely. Clinicians can access health records and data in real-time, ensuring they have the information they need to make informed decisions. Workflows are improved by providing tools that can conduct barcode validation, ensure accurate medication administration, facilitate specimen collection, capture clinical images, perform bedside ultrasounds, and even estimate blood loss post-surgery or labor and delivery. The intuitive interface and powerful hardware of Apple devices streamline workflows, allowing healthcare providers to focus on what matters most—their patients.

Enhancing Patient Engagement 

Apple’s technology also plays a crucial role in keeping patients connected to their care teams, with an ecosystem of healthcare products often described as bold and groundbreaking. With apps on iOS and iPadOS, patients can manage their health, track their progress, and communicate with their healthcare providers between office visits. The Health app and HealthKit-enabled devices simplify the process for patients to record and share their health data, fostering a more personalized and proactive approach to healthcare.

Innovating Medical Research 

Apple’s commitment to healthcare extends to medical research as well. The ResearchKit and CareKit frameworks enable researchers to create custom applications for their studies, making it easier to enroll participants, capture informed consent, and gather data more frequently. This collaboration with the medical community pushes the boundaries of what is possible in health research, ultimately leading to better patient outcomes. 

Security and Privacy 

In the healthcare industry, security and privacy are paramount. Apple has built its devices with robust security features, including encryption, secure authentication, and device management tools tailored for healthcare organizations. This ensures patient data is always protected, giving both healthcare providers and patients peace of mind. 

Investing in Clinical Communication 

Innovative technology is transforming clinical communications in the healthcare industry, making it more efficient, personalized, and secure, creating a future where healthcare is more human and connected. When leveraging the best devices, whichever platform you choose, you are investing in the potential for future innovation across multiple use cases. From empowering clinicians with cutting-edge technology to enhancing patient engagement, improving communication is at the forefront of healthcare innovation and improving patient care.

The modern healthcare environment demands innovative approaches to mitigate expenses, eliminate inefficiencies, alleviate provider fatigue, and boost patient outcomes. Connection delivers all-encompassing solutions tailored for healthcare organizations, aiming to streamline workflows, safeguard sensitive information, and enhance the experiences of patients and healthcare providers alike. Visit our dedicated healthcare page for more information.

]]>
How Contract-to-hire Engagement Boosts... https://community.connection.com/how-contract-to-hire-engagement-boosts-employee-retention-in-it/ Apr 15, 2025 Patrick Dja Konan https://community.connection.com/author/patrick-dja-konan/ https://community.connection.com/wp-content/uploads/2025/04/3007371-Contract-to-hire-Staffing-BLOG-POST.jpg

Organizations heavily depend on skilled IT professionals to lead digital transformation in the fast-paced tech world. However, finding the right IT talent can be a challenging task, especially with the risk of poor hires and high turnover rates. To address these challenges, many companies are choosing contract-to-hire staffing options to not only find the right fit for their teams, but also minimize the risks associated with hiring.

Understanding Contract-to-hire Staffing

Contract-to-hire staffing allows companies to hire IT professionals temporarily, with the option to make them permanent employees later. This option helps businesses assess candidates’ abilities, work habits, and fit within the company culture before committing long term. With the high demand for IT talent and an average tenure of less than two years, this staffing solution has become very popular.

Why Companies Choose Contract-to-hire

Minimized Risk of Poor Hires: A Harvard Business Review study found that 80% of employee turnover is due to bad hiring decisions. By implementing a contract-to-hire approach, organizations can assess candidates’ performance in practical settings, significantly diminishing the risk of hiring unsuitable employees.

Lower Turnover Rates: Employee turnover is about 18% annually, with 60% to 70% of it voluntary. According to the Work Institute, replacing an employee can cost up to 33% of their annual salary. Contract-to-hire options help ensure new hires fit well, leading to higher job satisfaction and lower turnover rates.

Flexibility in Staffing: The technology sector frequently experiences changes and varying project requirements. Contract-to-hire staffing allows companies to adjust their workforce according to project demands. This flexibility is essential for maintaining operational efficiency and adhering to deadlines.

Effectiveness of Contract-to-hire

Retention Rates: Data from the Society for Human Resource Management (SHRM) reveals that 74% of employers using contract-to-hire models report reduced turnover rates within the first year of employment. This statistic highlights the significance of the evaluation period in determining a suitable fit.

Time to Hire: Considering the average hiring time for IT positions is approximately 60 days, contract-to-hire options enable companies to promptly fill critical positions without compromising on quality.

Quality of Hire: Contract-to-hire staffing leads to a 15% increase in employee performance compared to direct hires, according to the National Bureau of Economic Research. This is due to the thorough vetting during the contract period.

In a competitive job market where IT talent is in high demand, companies should implement staffing strategies that reduce risks and increase efficiency. This approach allows companies to access a wider talent pool, ensuring they find candidates with the appropriate skills and experience, and providing them with the flexibility to minimize the impact on their teams and resources.

As an IT solutions provider, Connection assists companies in hiring IT talent for contract, contract-to-hire, and full-time positions. We offer a 90-working-day contract-to-hire program with no conversion fee, giving organizations 90 working days to evaluate prospective employees before making a permanent hiring decision. Contact us today to learn more.

]]>
CNXN Helix Wins Top Honors in U.S. Naval AI... https://community.connection.com/cnxn-helix-wins-top-honors-in-u-s-naval-ai-challenge/ Apr 08, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/04/2993421-CNXN-Helix-Navy-Win-BLOG.jpg

The Naval Information Warfare Center (NIWC) Atlantic Palmetto Tech Bridge recently hosted a prize challenge to develop generative AI and machine learning solutions that will enable Navy users to access secure AI capabilities in a practical, real-world manner. We’re excited to share that the CNXN Helix Center for Applied AI and Robotics took first place out of a tough field of competitors!

The contest, titled the “Secure Commercially-Based AI Environment for Real-World Naval Applications Challenge,” was issued by NIWC Atlantic Palmetto Tech Bridge, in collaboration with the Program Executive Office for Manpower, Logistics and Business Solutions.

The winning solution from CNXN Helix™ proposed an air-gapped AI platform that supports large AI models while empowering users with no-code and low-code interfaces—offering the Navy a flexible, scalable solution for mission-critical operations. Adhering to a complex and rigorous set of standards, the concept features both custom and commercially available components that utilize industry-standard best practices to ensure data security, isolation, and confidential computing at every layer. Presenting to the panel of judges, the CNXN Helix team demonstrated a secure, pre-configured environment that supports both experimentation and operational use by non-AI/ML experts. The team was also able to highlight several use cases in which CNXN Helix successfully deployed scalable, flexible versions of the platform within the defense environment. See what the experts from CNXN Helix had to say about the prize challenge and the proposed solution in our video interview with the team.

https://vimeo.com/1051938195/4e5c255c01?share=copy

About the Prize Challenge

The Secure Commercially-Based AI Environment for Real-World Naval Applications Challenge is part of the U.S. Government’s efforts to recognize outstanding achievements in basic, advanced, and applied research, technology development, and prototype development.

About CNXN Helix

The CNXN Helix Center for Applied AI and Robotics, a division of Connection, specializes in accelerating AI-driven solutions to fuel growth. From conceptualization to deployment, CNXN Helix streamlines the development of innovative AI applications for enhanced efficiency and profitability. To learn more about CNXN Helix and the team’s award-winning AI solutions, visit www.cnxnhelix.com

]]>
Mastering Data Orchestration: Turning Chaos... https://community.connection.com/mastering-data-orchestration-turning-chaos-into-control/ Apr 03, 2025 Fausto Dedeschi https://community.connection.com/author/fausto-dedeschi/ https://community.connection.com/wp-content/uploads/2025/04/2986821-AI-Data-Orchestration-BLOG.jpg

It’s 2:00 a.m. Your phone buzzes. Your data pipelines failed—again. A critical dashboard is missing half its numbers, and the CFO is asking why last night’s ETL job didn’t run. You check your data orchestration tools, but the logs are a mess. Is it a dependency issue? A failed API call?

If that sounds familiar, you’re not alone. Managing data orchestration can feel like drinking from a firehose, and IT teams struggle to debug workflows that break for no apparent reason.

Here’s the good news: this guide provides key steps to evaluate and select a data orchestration solution.

What Is Data Orchestration?

Data orchestration is the practice of automating and coordinating data workflows across multiple systems. It runs data pipelines efficiently by handling dependencies and providing governance. Unlike simple schedulers, orchestration tools adapt dynamically to uphold data integrity across complex environments.

Data engineering involves designing, building, and maintaining the data pipelines that move, transform, and store data. Data orchestration automates and coordinates the execution of those pipelines, ensuring tasks run in the correct sequence with the proper dependencies.

Why schedulers are not enough:

Traditional schedulers like cron and ActiveBatch trigger tasks at set intervals, but they lack awareness of dependencies or failures. By contrast, data orchestration tools:

  • Manage dependencies: Run jobs in the correct order.
  • Handle failures: Process retries, alerts, and logging.
  • Scale dynamically: Adjust to workload changes automatically.
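
Those three bullets are exactly what separates an orchestrator from cron. A minimal, library-agnostic sketch of that behavior in plain Python (task names are hypothetical; real tools like Airflow and Prefect add scheduling, logging, and UIs on top):

```python
import time
from graphlib import TopologicalSorter

def run_with_orchestration(tasks, deps, max_retries=3):
    """Run tasks in dependency order, retrying failures.

    tasks: dict mapping task name -> callable
    deps:  dict mapping task name -> set of upstream task names
    """
    # static_order() yields each task only after its upstream tasks
    for name in TopologicalSorter(deps).static_order():
        for attempt in range(1, max_retries + 1):
            try:
                tasks[name]()
                break
            except Exception as exc:
                print(f"{name} failed (attempt {attempt}): {exc}")
                if attempt == max_retries:
                    raise
                time.sleep(0.1)  # back off before retrying

# Example: load must wait for transform, which waits for extract
order = []
tasks = {
    "extract": lambda: order.append("extract"),
    "transform": lambda: order.append("transform"),
    "load": lambda: order.append("load"),
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
run_with_orchestration(tasks, deps)
print(order)  # → ['extract', 'transform', 'load']
```

Cron would fire all three on a timer and hope; the orchestrator knows the order and retries what breaks.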

The Most Common Data Orchestration Challenges (And How to Avoid Them)

Meet Bob, Senior Data Engineer at a mid-sized software firm. He used to dread deployment days. Pipelines would fail unpredictably, and debugging turned into an all-night marathon. But after his team implemented a structured data orchestration strategy, things just work.

Want to do the same? You can, but there are a few hurdles to clear first. Let’s break down the most common pitfalls and how to avoid them.

1. Scaling Challenges: Managing Hundreds of Jobs Efficiently

As data pipelines grow, managing hundreds—or even thousands—of interdependent tasks gets overwhelming. Without the right data orchestration tools, teams face workflow failures. A well-architected system should support modular design and automated retries. It should also offer distributed execution for scalability.

2. Breaking Down Data Silos for Seamless Integration

Many organizations struggle with data silos, which trap information in isolated systems. Effective data integration breaks down those silos and unifies data across sources.

3. Improving Data Quality Through Automated Validation

Poor data quality can break your reports and skew your insights. Automated validation rules and anomaly detection help identify and fix issues—before they impact downstream processes. High quality data improves decision-making and boosts overall pipeline reliability.

4. Strengthening Data Governance for Compliance and Security

Lack of data governance creates compliance risks and security vulnerabilities. Orchestrating workflows with built-in audit trails and access controls keeps data secure. Use policy enforcement to meet regulatory standards like GDPR and HIPAA.

5. Empowering Data Engineers with the Right Tools

Data engineers need flexible, developer-friendly data orchestration tools that support cloud-native, containerized architectures. Solutions like Prefect and Dagster provide dynamic scheduling and declarative configurations. These cloud-based tools cut down on manual intervention and improve workflow efficiency.

Choosing the Right Data Orchestration Tools (Without Losing Your Sanity)

Selecting a data orchestration tool can feel like debugging a failing pipeline in production—frustrating and high stakes. Every tool claims to be the best, but each has trade-offs. Without the right fit, you’ll spend more time fighting data silos than gaining insights.

Decision Matrix: Which Data Orchestration Tool Is Right for You?

Tool | Best For | Error Handling | Managed Services? | Real-time Triggers? | Ease of Setup
Apache Airflow | Large-scale, enterprise | Manual retries | AWS MWAA, Astronomer | Limited | Moderate
Dagster | Modular workflows | Automatic | No | Yes | Steep
Prefect | CI/CD and testing | Automatic | Prefect Cloud | Yes | Easy
Mage | Lightweight ML pipelines | Automatic | No | Yes | Easiest

Let’s zoom in on a few of those data orchestration tools.

Apache Airflow: The Enterprise Workhorse

Airflow dominates large-scale data integration with its robust scheduling and dependency management. But there’s a learning curve. Debugging directed acyclic graphs (DAGs) can be painful, and scaling requires serious infrastructure work. It’s best for teams needing full control and willing to invest in maintenance.

Dagster: The Modular Powerhouse

Dagster treats orchestration like data quality engineering, focusing on observability and testing. Unlike Airflow, it enforces best practices by design. However, its steep learning curve makes adoption tough. Dagster is ideal for teams prioritizing structured workflows and long-term maintainability.

Prefect: The CI/CD-friendly Option

Prefect shines in data governance, with a lightweight API and automatic retries. Its cloud service removes infrastructure headaches, but rapid updates sometimes cause breaking changes. It’s great for teams needing flexibility without dealing with Airflow’s complexity.

Mage: The Lightweight ML Choice

Mage is built for data analysis tools and machine learning pipelines, offering an easy-to-use, Pythonic approach. It lacks enterprise-scale features but is perfect for small teams or rapid prototyping, where simplicity and speed matter more than customization.

When to Choose a Managed Data Orchestration Platform

Are you better off self-hosting your data orchestration platform or using a managed service? If you go self-hosted, will you spend more time maintaining infrastructure than improving customer data pipelines? But if you choose managed, won’t you lose flexibility and risk vendor lock-in? Which one will scale efficiently without draining your team’s resources?

Here’s a breakdown to help you choose:

Self-Hosting: Full Control, Higher Maintenance

✅ Best for teams with strong data engineers and DevOps resources.
✅ More flexibility to customize, optimize, and control security.
✅ Avoids vendor lock-in and ongoing cloud service costs.
❌ Requires managing infrastructure, scaling, and troubleshooting.
❌ More time spent on maintenance instead of improving data pipelines.
❌ Hidden costs include DevOps overhead and monitoring expenses.

Managed Orchestration: Less Overhead, More Convenience

✅ No need to manage servers—AWS MWAA, GCP Composer, and Astronomer handle scaling and updates.
✅ Faster setup and integration with cloud-native data analysis tools.
✅ Built-in monitoring, security, and failover reduce operational risks.
❌ Higher cost, especially at scale.
❌ Less flexibility and reliance on a third-party provider.

In a nutshell, if you have the resources, self-hosting works. If not, managed orchestration frees up engineering time for higher-value work.

Popular Managed Data Orchestration Services

For teams looking to offload infrastructure management, these managed data orchestration services provide automation and scaling:

  • AWS Managed Workflows for Apache Airflow (MWAA): A fully managed service that makes it easy to run Apache Airflow on AWS. It can handle scaling and patching. It integrates with other AWS services.
  • Google Cloud Composer: A managed Apache Airflow service that can help you create and manage workflows. It integrates into Google Cloud.
  • Astronomer: A commercial platform offering managed Apache Airflow with high-level observability. Its security and enterprise support simplify workflow management.
  • Azure Data Factory: A cloud-based data integration service. Data Factory lets you build data-driven workflows for orchestrating data movement.
  • IBM DataStage: A data integration tool that supports the development and running of jobs that move and transform data. It’s available as a fully managed service on IBM Cloud Pak for Data.
  • Prefect Cloud: A managed workflow orchestration tool. It provides automatic retries, logging, and dynamic task scheduling.

CI/CD for Data Pipelines: Stop Debugging in Production
You push your latest data pipeline update, grab a coffee, and check your messages—no alerts, no failures. Your data orchestration process runs smoothly, and reports are generated on time. No last-minute rollbacks, no CFO breathing down your neck. Why? Because your data team finally implemented proper CI/CD testing.

Why Testing Data Pipelines Is Essential
Without automated testing, data governance breaks down fast. A small schema change can corrupt reports, siloed data can reappear, and an unnoticed failure can cascade across workflows. CI/CD catches issues before they hit production, saving time, money, and sanity.

How to Version Control Workflows with Git
Data engineers should treat data pipelines like software—version control everything. Use Git to track pipeline changes, enforce code reviews, and roll back safely. Best practices:

  • Store pipeline definitions (DAGs, scripts) in Git.
  • Use feature branches for changes, with automated testing on commit.
  • Implement pull request approvals before merging to main.

Setting Up Automated Testing for Airflow DAGs
Airflow makes testing painful without the right setup. Here’s the fix:

  1. Use pytest to unit test DAGs—validate task execution and dependencies.
  2. Mock external services to avoid API failures.
  3. Run DAG validation scripts before deployment.
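
Step 3, DAG validation, doesn’t even require importing an orchestrator. A sketch using a plain dependency dict (in a real suite you would build this dict from your Airflow DAG definitions; the task names here are hypothetical):

```python
def find_cycle(deps):
    """Return True if the task dependency graph contains a cycle.

    deps: dict mapping task name -> list of upstream task names.
    """
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited, in progress, done
    color = {t: WHITE for t in deps}

    def visit(task):
        color[task] = GRAY
        for upstream in deps.get(task, []):
            if color.get(upstream, WHITE) == GRAY:
                return True  # back edge: cycle found
            if color.get(upstream, WHITE) == WHITE and visit(upstream):
                return True
        color[task] = BLACK
        return False

    return any(visit(t) for t in deps if color[t] == WHITE)

# A pytest-style check that runs on every commit:
def test_dag_is_acyclic():
    deps = {"extract": [], "transform": ["extract"], "load": ["transform"]}
    assert not find_cycle(deps)
```

Wire a check like this into CI and a circular dependency fails the build instead of the 2:00 a.m. run.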

Using Prefect’s Parameterized Flows for Better Data Quality Checks
Prefect’s dynamic parameterization helps data teams run the same flow across multiple datasets with built-in validation. Automate schema checks, data profiling, and alerts to ensure every run meets quality standards before moving downstream.
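
The pattern is one set of checks, parameterized by dataset and columns. A plain-Python sketch of the idea (in Prefect you would decorate run_quality_checks with @flow so each run is tracked and retried; the sample data is made up):

```python
def check_schema(rows, required_columns):
    """Return the required columns missing from any row."""
    missing = [c for c in required_columns for r in rows if c not in r]
    return sorted(set(missing))

def check_non_null(rows, column):
    """Return indices of rows where `column` is null or empty."""
    return [i for i, r in enumerate(rows) if not r.get(column)]

def run_quality_checks(rows, required_columns, key_column):
    """Parameterized checks: same logic, any dataset."""
    report = {
        "missing_columns": check_schema(rows, required_columns),
        "null_keys": check_non_null(rows, key_column),
    }
    report["passed"] = not report["missing_columns"] and not report["null_keys"]
    return report

orders = [{"sku": "A-1", "qty": 2}, {"sku": "", "qty": 5}]
report = run_quality_checks(orders, ["sku", "qty"], "sku")
print(report["passed"])  # row 1 has an empty sku → False
```

The same function validates an orders feed today and a suppliers feed tomorrow; only the parameters change.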

Troubleshooting: What to Do When Everything Fails (Because It Will)
Troubleshooting data pipeline failures can feel like searching for a missing transaction in a sea of scattered logs. It can make you sweat. When jobs fail unpredictably and dashboards go dark, teams waste hours chasing issues instead of delivering insights. A structured approach to debugging lets you zero in on failures before they derail your operations.

Symptom | Cause | Fix
DAGs not triggering | Missing dependencies | Check DAG definition and logs
Jobs failing at random steps | State inconsistency | Ensure idempotency
Partial failures corrupting data | No rollback mechanism | Use atomic transactions
Airflow scheduler crashes | Too many concurrent tasks | Optimize parallel execution
Logs missing crucial details | Poor logging setup | Implement structured logging

Let’s take a closer look at some important troubleshooting techniques.

Find the Root Cause Fast
Without structured data governance, even a small dependency issue can bring down an entire pipeline. First step? Check logs—if they exist. Poor logging makes debugging impossible, so implement structured logging with clear log levels (INFO, WARN, ERROR). Use standardized formats (JSON) and centralized aggregation tools like ELK or Datadog.
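
Structured logging needs no extra dependencies; the standard library can emit one JSON object per record, which ELK or Datadog can then parse. A minimal sketch (the logger and field names are illustrative):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # custom fields passed via `extra=` land as record attributes
            "pipeline": getattr(record, "pipeline", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("etl")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("nightly load finished", extra={"pipeline": "sales_daily"})
```

Every line now carries a level and a pipeline name, so "which job failed?" becomes a query instead of a grep through free-form text.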

Prevent State Inconsistencies
Jobs that fail randomly are a sign of state inconsistency. When processing data, ensure idempotency—rerun jobs without corrupting outputs. Use atomic transactions to maintain integrity, so partial failures don’t leave your database in limbo.
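
Both ideas, idempotency and atomicity, fit in a few lines of SQL. A sketch using SQLite's upsert syntax (table and column names are hypothetical; the same ON CONFLICT pattern works in PostgreSQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (day TEXT PRIMARY KEY, total REAL)")

def load_daily_total(conn, day, total):
    """Idempotent load: rerunning for the same day overwrites, never duplicates.

    The transaction makes the write atomic: it fully lands or fully rolls back.
    """
    with conn:  # commits on success, rolls back on exception
        conn.execute(
            "INSERT INTO daily_sales (day, total) VALUES (?, ?) "
            "ON CONFLICT(day) DO UPDATE SET total = excluded.total",
            (day, total),
        )

load_daily_total(conn, "2025-04-03", 1200.0)
load_daily_total(conn, "2025-04-03", 1200.0)  # safe to rerun after a failure
rows = conn.execute("SELECT COUNT(*) FROM daily_sales").fetchone()[0]
print(rows)  # → 1
```

Because the retry overwrites instead of appending, the orchestrator can rerun a failed job blindly without corrupting totals.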

Avoid Siloed Failures
Siloed data can hide errors. A single failing job in an upstream system can silently corrupt downstream reports. Data teams should enforce dependency checks at every stage, using DAG validation to prevent invisible failures.

Airflow vs. Dagster: When to Use Sensors or Schedules
Data teams using Airflow should leverage sensors for event-driven workflows—trigger jobs when upstream data is ready, instead of running on blind schedules. Dagster’s sensors handle this natively, reducing unnecessary processing and preventing wasted compute.
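
A sensor is just a readiness check that runs before the work does. A minimal polling sketch in plain Python (Airflow's FileSensor and Dagster sensors implement the same idea, with proper backoff and scheduling):

```python
import time
from pathlib import Path

def wait_for_upstream(path, poke_interval=0.1, timeout=5.0):
    """Block until the upstream file lands; raise if it never does."""
    deadline = time.monotonic() + timeout
    marker = Path(path)
    while time.monotonic() < deadline:
        if marker.exists():
            return True  # upstream data is ready; safe to start the job
        time.sleep(poke_interval)
    raise TimeoutError(f"upstream data never arrived: {path}")
```

A downstream job calls this on its input file before loading, so nothing runs against half-written or absent data.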

Real-Time vs. Batch Processing: Do You Really Need Millisecond Latency?

Picture this: Your team switched from real-time data orchestration to smart batch processing, and suddenly, everything is working. Compute costs are down, dashboards refresh without lag, and you’re not waking up at 2:00 a.m. to chase missing events. Turns out, not everything needs millisecond latency.

Factor | Batch Processing | Real-time Processing
Best For | Reporting, analytics, periodic updates | Fraud detection, stock trading, IoT data
Compute Cost | Lower (processes in bulk) | Higher (continuous processing)
Data Complexity | Easier to manage and debug | More dependencies, harder to maintain
Failure Impact | Isolated; can rerun jobs | Immediate; can disrupt live systems
Example | Loading daily sales into a cloud data warehouse | Processing a credit card transaction

When Real-Time is Overkill
Real-time data orchestration sounds cool, but for most workflows, it’s unnecessary. Unless you’re running fraud detection or stock trading algorithms, batch processing often delivers the same insights at a fraction of the cost.

The Hidden Cost of Streaming Everything
Pushing every event immediately increases processing overhead. It can also introduce unnecessary complexity if data sources aren’t properly integrated. Instead, use batch processing to consolidate updates and keep workflows manageable.

How to Choose the Right Approach
If your data sources update irregularly or aren’t time-sensitive, batch is your best friend. Streamlined workflows mean fewer dependencies and lower chances of failure. Use real-time only where it truly adds value.

Future-proofing Your Data Orchestration Strategy
Scaling data workflows shouldn’t mean constant firefighting. Yet, relying on a single vendor can leave your pipelines trapped in a rigid system that no longer fits your needs. Without a future-proof strategy, migrations are painful, and your growth is limited. The right orchestration approach keeps your data portable and ready for whatever comes next.

Avoid Vendor Lock-In with Open-Source Flexibility
Locking into a single vendor creates risk, especially if costs spike or the provider sunsets a feature. Open-source tools like Apache Airflow and Dagster prevent siloed data by offering flexibility, customization, and control over your infrastructure.

Feature | Open-source (Airflow, Dagster) | Managed (AWS MWAA, GCP Composer)
Flexibility | High—customizable and self-hosted | Limited—depends on provider
Cost | Lower (but requires maintenance) | Higher (pay for ease of use)
Security Control | Full control over security | Provider manages security
Scalability | Requires tuning | Auto-scales with demand

Design Modular, Reusable Data Workflows
A well-architected pipeline isn’t a tangled mess—it’s a series of reusable components. Follow these best practices:

  • Break workflows into small, independent tasks to reduce failure points.
  • Use templates and parameterized jobs to handle multiple data sources without rewriting code.
  • Implement workflow versioning to track and rollback changes easily.

Containerization: Making Pipelines Portable
Want to load data across multiple environments without rewriting everything? Use Docker and Kubernetes to package workflows into version-controlled containers. This keeps the work consistent from dev to production.

Future-proof with a Scalable Data Warehouse
A rigid data warehouse can bottleneck growth. Choose platforms that scale and support multiple query engines. Cloud-native warehouses like Snowflake or BigQuery will let you adapt as your data needs evolve.

Conclusion: The Secret to Successful Data Orchestration Implementation

Successfully implementing Data Orchestration requires more than selecting the right tool; it demands a strategic approach that ensures scalability, reliability, and efficiency.

Actionable Next Steps

  1. Define Clear Objectives and Business Outcomes
    Why are you orchestrating? Are you solving pipeline failures, reducing cloud costs, or scaling your data operations (including AI)?
  2. Choose the Right Orchestration Tool for Your Needs
    Pick a tool that matches your pipeline complexity, cloud strategy, and team skills.
  3. Build Modular and Scalable Workflows
    Design pipelines as small, reusable building blocks rather than massive workflows.
  4. Automate Failure Handling and Error Recovery
    Build self-healing pipelines that automatically recover from failures.
  5. Implement Strong Monitoring and Observability
    Know when and why a pipeline fails before it impacts business operations.
  6. Optimize for Cost and Performance Efficiency
    Design cost-efficient workflows that minimize cloud spend and processing overhead.
  7. Ensure Security and Compliance from Day One
    Security should be baked into orchestration, not an afterthought.
  8. Foster a Culture of Collaboration Between Teams
    Create joint workshops to align teams on orchestration goals.

Final Takeaway

  • Automate Everything – No manual interventions, just smooth execution.
  • Monitor Relentlessly – Know when things fail and why.
  • Optimize Smart – Reduce cost, improve efficiency, and scale with confidence.

Take Control of Your Data Orchestration with CNXN Helix

Are you facing challenges with complex data pipelines, isolated data silos, or costly inefficiencies? The CNXN Helix Center for Applied AI and Robotics helps IT leaders streamline data orchestration, optimize data processing, and eliminate bottlenecks—ensuring your workflows run smoothly, securely, and at scale.

🔹 Future-proof Your Data Strategy: Enjoy the benefits of open-source flexibility, managed solutions, and customizable workflows designed specifically for your business.
🔹 Eliminate Data Silos: Connect and unify data sources for real-time insights and better decision-making.
🔹 Maximize Performance: From data collection to storage, we fine-tune every step to eliminate inefficiencies and boost overall performance.

Ready to transform your data orchestration? Schedule a workshop today! Email AI@connection.com or call 1.888.213.0260 and ask for a Helix Pro.

]]>
AI in Supply Chains: What Works, What’s... https://community.connection.com/ai-in-supply-chains-what-works-whats-hype-and-whats-next/ Apr 02, 2025 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2025/03/2986736-AI-In-SupplyChain-BLOG.jpg

Michael, a supply chain manager at a growing logistics firm, was staring down a nightmare. Warehouses were overflowing with the wrong inventory while high-demand SKUs kept going out of stock. Then he turned to an AI demand-forecasting tool. Within months, stockouts dropped by 30%, and overstock fell by half.

AI is solving real supply chain problems—from route optimization to risk detection—but there’s a catch. It only works if you apply it to the right problems, with clean, structured data. So how can you get there?

This article shows where AI in supply chain operations actually delivers. You’ll see the use cases and best practices that are working, as well as the pitfalls to avoid. That matters, because like Taiichi Ohno said, “The right tool in the wrong place only automates waste.”

1. What Is AI in Supply Chains, Really?

AI in the supply chain means applying machine learning to manage logistics networks. AI systems analyze historical data and apply computer vision and demand forecasting to optimize product flow. The goal is fewer disruptions, lower costs, and better decision-making. But AI has its limits.

✅ AI excels at processing structured data to spot patterns and risks and suggest improvements.

❌ It struggles with unstructured data. It can’t replace human supply chain professionals, but it can tackle their repetitive tasks.

Here’s how AI is in use today:

AI Use Case | Effectiveness | Adoption Level
Demand forecasting | ✅ High | 🔵 Widespread
Inventory optimization | ✅ High | 🔵 Widespread
Route optimization | ✅ High | 🟡 Moderate
Warehouse automation | ✅ High | 🟡 Moderate
Supplier risk assessment | ✅ High | 🟡 Moderate
Real-time tracking | ✅ High | 🔵 Widespread
Procurement automation | ⚠️ Mixed | 🔴 Low
Predictive maintenance | ✅ High | 🟡 Moderate
AI-powered chatbots | ⚠️ Mixed | 🔴 Low
Freight cost reduction | ✅ High | 🟡 Moderate
Dynamic pricing models | ⚠️ Mixed | 🔴 Low
AI-driven contract writing | ❌ Poor | 🔴 Low
Labor scheduling | ✅ High | 🟡 Moderate
Sustainability optimization | ⚠️ Mixed | 🔴 Low
AI-enhanced cybersecurity | ✅ High | 🟡 Moderate

AI in supply chains isn’t magic. But the right AI—in the right place—can be a powerful tool for cutting operational costs.

2. AI that Works: Real-world Use Cases in Supply Chain Operations

Artificial intelligence can be brilliant or problematic. The deciding factor is how you implement it. Consider Sarah, a supply chain manager in a mid-market consumer goods firm. She watches the numbers roll in—inventory levels are off, warehouses are overloaded, and demand just shifted again.

Sarah could implement AI in one of two ways. She could address a clear business need and feed it clean data. Or she could plug it into messy spreadsheets and outdated ERP systems, hoping for a quick fix. One approach delivers ROI. The other leads to bad outputs and fulfillment delays.

Here’s where AI in supply chain management is successfully delivering ROI, and where it’s failing.

Demand Forecasting: Smarter Inventory, Less Waste

Every supply chain lives or dies by demand forecasting. Get it right, and you trim waste and cut your storage costs—oh, and by the way, you keep shelves stocked. Get it wrong, and you’re stuck with overstocked warehouses or scrambling to fill empty shelves. AI can analyze historical data and dampen the bullwhip effect by adjusting inventory levels in real time.
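
Under the marketing, the core of classical demand forecasting is simple statistics. A minimal exponential-smoothing sketch on made-up weekly sales, one of the building blocks that fancier models extend:

```python
def exponential_smoothing(history, alpha=0.3):
    """One-step-ahead forecast; higher alpha weighs recent weeks more."""
    forecast = history[0]  # seed with the first observation
    for observed in history[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

weekly_units = [100, 120, 110, 150, 140]  # illustrative sales history
next_week = exponential_smoothing(weekly_units)
print(round(next_week, 1))  # → 126.0
```

Tuning alpha trades stability against responsiveness, which is exactly the overstock-versus-stockout trade-off in miniature.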

✅ Where AI Wins: Walmart is predicting shopping trends with AI forecasting. They’ve already slashed forecasting errors by 30% and saved hundreds of millions on inventory costs. Their AI tools crunch oceans of real-time data on sales, weather, and market trends. Then they adjust stock levels to match.

❌ Where AI Fails: Small retailers can struggle to squeeze ROI from AI programs. AI algorithms need clean, structured data, but many businesses don’t have it. For decades, most have leaned too heavily on tangled spreadsheets and outdated ERP systems.

⚙️The Fix: Standardize data formats and remove duplicates. The right partner can help you clean your data and set up AI for real results.

Route Optimization: AI Logistics

In supply chain management, even small route inefficiencies add up fast. Take Titan Logistics—a mid-sized distributor struggling to meet delivery windows. A winter storm reroutes their trucks, adding hours to transit times. Fuel costs spike, shipments miss deadlines, and supply chain disruptions ripple through their logistics networks. AI route optimization promises to stanch scenarios like this. It can scan fuel prices and traffic conditions to find the most efficient route.

✅ Where AI Wins: UPS’s ORION system started without AI nearly a decade ago, saving 10 million gallons of gas per year. Today, it packs an AI punch, recalculating delivery routes dynamically, rolling in factors like customer demand and road closures.

❌ Where AI Fails: AI can face hurdles with supply chain interruptions caused by extreme weather. A blizzard can shut down a hub, and AI models don’t always adjust fast enough. When the unexpected strikes, supply chain planners need to step in to make real-time decisions.

⚙️ The Fix: Use real-time data like traffic updates and weather forecasts. An experienced AI partner can fine-tune your tools to handle the unexpected. 

Warehouse Automation: AI Meets Robotics

A warehouse is not a museum. Products need to move, not sit collecting dust. Robots powered by machine learning models can speed up fulfillment and scrub out errors. But automation needs investment, and adoption can vary.

✅ Where AI Wins: DHL teamed up with Robust.AI to introduce collaborative robots like Carter. These AI-driven bots help sort and transport goods. They’re driving down errors and speeding up fulfillment. To date, they’ve already made over 500 million picks.

❌ Where AI Fails: For small businesses, the upfront cost of AI robotics can be prohibitive. Many smaller firms sink significant budgets into automation tech, only to face hidden costs like integration and training.

⚙️ The Fix: Turn to AI warehouse management software before investing in costly robotics. Use a partner to help automate workflows without going over budget.

Risk Management: AI Detects Global Supply Chain Disruptions

Modern global supply chains face constant risks, from climate change to political instability. Logistics pros are using AI to spot disruptions before they can turn into full-blown bottlenecks. But over-relying on automation can limit flexibility when the unexpected strikes.

✅ Where AI Wins: A Fortune 500 automaker was drowning in delays and excess stock. AI stepped in with a digital twin, giving real-time visibility across suppliers. They cut inventory by 20% and saved $10M in expedite costs. They also bagged a 94.7% drop in point-of-use misses. No more flying blind.

❌ Where AI Fails: AI still can’t replace human intuition in crisis situations. Over-reliance on AI algorithms opens the door to decision-making bottlenecks. When a black swan hits or new regulations drop, the playbook can go out the window.

⚙️ The Fix: Use AI for early warnings, but keep humans in control for crisis decisions. Build manual overrides into AI systems.

AI in Supply Chains: Big Wins, Big Potential

AI delivers huge wins for industry giants—Fortune 500 firms with deep pockets and clean data. But for mid-sized businesses, the path can be tricky. The challenges—like high costs and messy data—are real. With the right approach, mid-sized companies can tap into the same efficiencies without the headaches.

Mid-sized companies don’t need moonshot AI—they need real ROI. That’s where the CNXN Helix Center for Applied AI and Robotics comes in. We cut through the hype to deploy AI where it actually works, integrating Goldratt-level automation without disrupting your operations.

With CNXN Helix, you get:

  • Expert guidance from supply chain pros who know AI and understand your business needs
  • AI supply chain planning to right-size inventory and reduce waste
  • Demand forecasting models that adapt to market shifts in real time
  • Risk management solutions to detect disruptions before they hit
  • Process automation that enhances efficiency without overhauling your tech stack

3. AI Myths that Are Costing Supply Chain Managers Money

AI sounds like magic—feed it data, and suddenly your supply chain runs itself. But misconceptions can derail your progress. Here’s what AI myths get wrong, and how to get it right.

Myth 1: AI Can Run Your Supply Chain Automatically

AI is an assistant, not the boss. It can crunch real-time data and flag risks. But when a supplier suddenly pulls out or new regulations change the game, it can’t make calls. That’s still up to supply chain planners.

Let AI do the heavy lifting, but even the best machine learning models can’t correct mid-crisis. Just ask anyone who’s had AI suggest they route a critical shipment through a port that’s been shut down for weeks.

Myth 2: More Data = Better AI

More data isn’t always better—it’s just more. Artificial intelligence needs clean, structured data to make smart predictions. Feed it a mess of bad ERP inputs, duplicate records, and outdated spreadsheets, and you don’t get insight. You get GIGO.

Think of it like training a self-driving car on old road maps—it’s going to miss every detour. Companies that prioritize data quality—cleaning, structuring, and integrating it—see AI deliver. The rest burn budgets trying to make it work.

Myth 3: AI Will Replace Supply Chain Jobs

AI agents aren’t coming for your job. But they are changing it. AI is a force multiplier. It takes repetitive tasks off your plate and flags risks before they escalate. But at the end of the day, we still need humans to drive strategy. Like Toyota’s Just-in-Time system, it’s powerful, but only when it’s done in the right way.

4. The AI Implementation Playbook: How Supply Chain Organizations Can Get It Right

AI in supply chain management is like Christensen’s disruption theory. It can be a game changer or untapped potential. Picture this: A supply chain manager watches real-time dashboards adjust inventory levels before a shortage hits. No missed orders. Just predictable operations—and a deep breath.

Step 1: Start with a Problem, Not a Tool

It’s the old “fail to plan” mantra. A company invests in an expensive system that doesn’t end up panning out. Artificial intelligence in the supply chain works best on a clear, defined problem—not vague “innovation.”

How to Get It Right:

  • Identify the biggest pain point. Is it late shipments? Overstocked warehouses? Frequent stockouts?
  • Talk to the frontline. Your supply chain planners and ops teams know where things break down. Listen.
  • Map the cost of inaction. What does this problem cost in lost revenue, wasted inventory, or delays?
  • Define a use case before choosing AI. If you can’t explain how AI fixes the issue, don’t buy it.

Solve a real problem, and AI delivers ROI. Deploy it without a plan, and you’re just burning cash.

Step 2: Clean and Structure Your Data

This is the hardest step—and the most important to get right. AI in supply chain management is only as strong as its data. If yours is messy or scattered across legacy systems, AI will just automate your bad decisions faster.

How to Get It Right:

  • Audit your data. Map where data lives—ERP, spreadsheets, emails, supplier portals. Pull reports. Look for missing fields, duplicate entries, and conflicting numbers. If two systems show different inventory levels, AI won’t know which one is right. Identify the “master” record and sync the others to it. If needed, use data reconciliation tools to flag and resolve discrepancies before you get AI involved.
  • Standardize formats. Use consistent naming conventions for SKUs, locations, and suppliers. Convert manual entries into structured fields. If one team logs “NY Warehouse” and another logs “New York DC,” AI won’t connect the dots. Enforce a single format and apply automated data validation to flag inconsistencies.
  • Unify sources. Artificial intelligence needs one source of truth. Consolidate fragmented systems by integrating ERP, WMS, and TMS. If a full integration isn’t possible, create automated data pipelines to sync critical raw data.
  • Fill the gaps. If you lack on-the-spot data, invest in RFID, barcode scanning, or IoT sensors to track inventory and shipments. If past sales are incomplete, pull market trends and industry benchmarks to supplement missing insights.
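
The standardize-and-deduplicate steps above can be sketched in a few lines. The records and alias table here are hypothetical; real cleanup would run inside your ETL layer:

```python
def clean_records(records, aliases):
    """Standardize location names and dedupe by SKU, keeping the newest row."""
    seen = {}
    # sort oldest-first so later (newer) rows overwrite earlier ones
    for rec in sorted(records, key=lambda r: r["updated"]):
        rec = dict(rec)
        rec["location"] = aliases.get(rec["location"], rec["location"])
        rec["sku"] = rec["sku"].strip().upper()  # enforce one SKU format
        seen[rec["sku"]] = rec
    return list(seen.values())

# "NY Warehouse" and "New York DC" are the same place; AI won't know that
aliases = {"NY Warehouse": "New York DC"}
raw = [
    {"sku": " a-1 ", "location": "NY Warehouse", "updated": "2025-01-01"},
    {"sku": "A-1", "location": "New York DC", "updated": "2025-02-01"},
]
cleaned = clean_records(raw, aliases)
print(cleaned)  # one record: sku "A-1" at "New York DC"
```

After this pass, the two conflicting rows collapse into one master record, which is the "single source of truth" the bullets call for.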

This takes work, but if your data is a mess, AI in your supply chain will be, too.

Step 3: Choose the Right AI for Your Supply Chain Needs

AI isn’t one-size-fits-all—different solutions serve different problems. Taiichi Ohno built efficiency by matching the right tools to the right tasks. The same applies here. AI in supply chain management works best when it’s tailored to specific challenges. Here’s how different AI tools stack up:

AI Tool | Use Case | User Reaction
Predictive Maintenance | Prevents equipment failures before they happen | “Reduced downtime by 30%, but setup was complex.”
Route Optimization | Finds the most efficient route for deliveries | “Saves fuel and time but struggles with real-time disruptions.”
Demand Prediction | Can process vast amounts of data to predict customer demand | “Big improvement in accuracy, but bad data leads to bad forecasts.”
Warehouse Automation | Uses machine learning models to optimize picking and packing | “Faster fulfillment, but costly for small businesses.”
AI-powered Risk Management | Identifies supply chain bottlenecks before they escalate | “Helps with planning, but AI still misses black swan events.”

Step 4: Integrate AI Without Disrupting Supply Chain Management

AI should enhance decision-making, not derail operations. Think of Drucker’s approach to management: Integrate the new technique without breaking what already works. A supply chain organization that drops AI into workflows without a transition plan risks confusion, inefficiencies, and employee pushback.

How to Get It Right:

  • Start small. Pilot AI in one area before expanding.
  • Train teams early. AI is only as good as the people using it.
  • Keep humans in the loop. AI can flag supply chain interruptions, but humans should make final calls.

A smooth rollout guides AI to add value instead of adding chaos.

Step 5: Monitor and Adjust for Continuous Improvement

AI isn’t plug-and-play. It’s an evolving system that needs constant calibration. Like the cycle of continuous improvement, it needs ongoing tweaks based on current data and user feedback. Even the best AI algorithms can drift, leading to errors.

How to Get It Right:

  • Set KPIs before rollout—track accuracy, cost savings, and efficiency.
  • Regularly audit AI outputs to catch errors early.
  • Adjust AI models based on historical data and changing market conditions.

Without monitoring, AI stops being a tool and starts being a liability.

5. What’s Next? The Future of AI in Global Supply Chain Management

AI is reshaping global supply chain management, but like RFID when it was new, its biggest impact is still ahead. The companies that adapt will gain efficiency, reduce risks, and drive smarter decision-making. The ones that don’t will struggle to keep up.

Emerging Trends:

  • AI + Blockchain for real-time supply chain partner transparency and fraud prevention.
  • Generative AI assisting in contract negotiations and vendor selection, cutting deal times.
  • AI-driven sustainability tracking to monitor carbon footprints and meet new regulations.
  • Hyper-personalized logistics using real-time AI decision-making to optimize routing and inventory.

For logistics pros, the challenge isn’t whether AI will change the industry. It’s how fast they can adapt. If you embrace data-driven AI, you’ll get a competitive edge.

Conclusion: Artificial Intelligence Is a Tool—Use It Wisely

AI isn’t a magic bullet, but when applied strategically, it can transform supply chain management. The key? Start small, clean your data, and focus on solving real problems. Artificial intelligence works best as a decision-making partner. Companies that use it wisely will see ROI. Those that don’t risk wasted budgets and failed implementations.

Unlock AI’s Full Potential with CNXN Helix

AI can drive real ROI—but only when it’s deployed in the right way. CNXN Helix helps mid-sized businesses implement successful AI solutions, from demand forecasting to supply chain risk management. Our experts cut through the hype and deliver AI strategies tailored to your business needs.

Get started today. Contact the CNXN Helix Center for Applied AI and Robotics to assess your biggest bottlenecks and build an AI roadmap that delivers results.

]]>
AI in Retail: Smarter Inventory and Dynamic... https://community.connection.com/ai-in-retail-smarter-inventory-and-dynamic-pricing/ Apr 01, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/03/2986921-AI-in-Retail-BLOG.jpg

Meet Emma. She’s the founder of a mid-market clothing retailer. She doesn’t worry about stockouts or pricing wars anymore. Her AI inventory system predicts demand before it spikes, keeping bestsellers in stock without overordering. Meanwhile, AI customer insights help personalize her marketing campaigns. Her conversions look like Amazon on Cyber Monday.

Emma may be fictional, but her story is playing out right now across the retail industry. Getting there starts with cleaning data and overcoming the high costs of AI implementation. Retailers also have to integrate AI tools with their existing inventory and POS systems—without disrupting daily operations.

Retail is all about:

🔹 Customer experience: Fast shopping and personalized service.
🔹 Supply chain optimization: Keeping shelves stocked and cutting waste.
🔹 Pricing strategies: Setting competitive prices that fuel profits.

AI is already supercharging all these core areas of retail. This article breaks down the increasing role of AI in retail. We also cover use cases and how-tos to help you start your pilot programs.

What Is AI in Retail?

AI in retail is artificial intelligence that analyzes data, predicts demand, and optimizes operations to improve shopping experiences and increase sales. It powers personalized recommendations, automated inventory management, and cashier-less stores, making retail faster, smarter, and more efficient.

Retailers use AI for:

  • Fraud detection: Spotting unusual transactions before they cause damage.
  • Demand forecasting: Predicting inventory needs to prevent stock issues.
  • Pricing optimization: Adjusting prices based on demand and competition.
  • Visual recognition: Automating checkout and improving store layouts.


Examples: Sephora uses AI to personalize beauty recommendations. Nike’s AI-driven supply chain puts products where and when customers need them. AI insights also help retailers fine-tune marketing. AI tools can predict trends and improve the shopping experience across digital and physical stores.

As AI evolves, expect even smarter shopping experiences, with hyper-personalized promotions that know what customers want before they do.

AI Use Cases in Retail

AI adoption in retail is at 42%, and another 34% of retailers have started pilot programs. Why? Because it delivers real results—lower costs and stronger consumer engagement. AI is optimizing distribution and preventing fraud. It’s also driving smarter marketing campaigns and analyzing customer feedback. Here’s how:

Optimizing Inventory Management and Supply Chain Operations
Managing retail inventory without AI can feel like trying to restock shelves blindfolded. Overstock drains profits, while stockouts frustrate customers. AI changes the game by slashing waste. Here’s how artificial intelligence is reshaping retail inventory and supply chain operations.

  • AI-powered Demand Forecasting: AI analyzes valuable customer data to predict sales. For instance, Target uses AI-driven sales forecasting to adjust inventory. Their platform analyzes weather and local buying habits to keep inventory levels tight. Models used: time-series analysis, neural networks, regression models.
  • Automated Inventory: AI tracks stock in real time, reducing shortages and manual errors. For instance, Zara’s AI-driven system optimizes assortment planning. It uses RFID sensors and computer vision to track stock and sell the right mix of products in each location. Models used: machine vision, reinforcement learning, decision trees.
  • Supply Chain Optimization: AI maps out retailers’ logistics networks, optimizing routes and warehouse locations. AI additions to UPS’s ORION system optimize delivery routes by analyzing vast real-time logistics data. The system factors in traffic and weather to create the most efficient delivery routes. It also analyzes package volume to save fuel and cut delays. Models used: graph-based optimization, reinforcement learning, clustering algorithms.
  • Predictive Analytics for Logistics: AI can spot sales and supply patterns before they throw deliveries off track. For example, Walmart uses AI to track real-time supplier performance and predict disruptions. This lets them reroute shipments and adjust distribution plans before problems impact customer satisfaction. Models used: Bayesian networks, deep learning, anomaly detection.
  • Robotic Shelf Restocking and Smart Shelves: AI robots are scanning shelves to flag items that are low in stock. For instance, Best Buy’s Tally robot checks thousands of products daily. It tracks inventory with smart vision and keeps pricing accurate with digital shelf tags. Models used: computer vision, IoT analytics, reinforcement learning.
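To make the forecasting idea concrete, here is a minimal demand-smoothing sketch using simple exponential smoothing, one member of the time-series family mentioned above. The weekly sales figures and smoothing factor are made-up illustrations; production forecasters layer on seasonality, promotions, and external signals like weather.

```python
def exponential_smoothing(sales, alpha=0.3):
    """Return a one-step-ahead forecast via simple exponential smoothing.

    alpha controls how heavily recent observations outweigh history.
    """
    forecast = sales[0]
    for observed in sales[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

# Hypothetical weekly unit sales for one SKU.
weekly_units = [120, 135, 128, 150, 160]
next_week = exponential_smoothing(weekly_units)
```

Even this toy model reacts to the recent upward trend while damping noise, which is the core intuition behind the far richer neural and regression models retailers actually deploy.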

Improving Customer Experience and Engagement
While 73% of customers expect to use AI-powered chatbots, not all AI experiences are created equal. A clunky chatbot can frustrate shoppers more than it helps. To get it right, train AI on real customer interactions. Then integrate it with your CRM and set up handoffs to human support when needed.

  • Personalized Shopping Experiences: AI analyzes customer behavior to recommend new items. For instance, Nordstrom’s AI recommendation engine boosts conversions and increases average order value. Models used: collaborative filtering, deep learning, reinforcement learning.
  • AI-driven Customer Loyalty Programs: AI personalizes rewards based on shopping habits. For example, Starbucks’ loyalty program uses AI to analyze transaction data. It then delivers individualized offers that boost engagement and retention. Models used: machine learning, trend analysis, clustering algorithms.
  • Sentiment Analysis for Customer Feedback: AI scans social media and reviews to measure shopper satisfaction. For instance, H&M’s AI processes customer feedback in real time. It can flag issues early to improve product offerings and service. Models used: natural language processing (NLP), sentiment analysis, deep learning.
  • Conversational AI and Chatbots: AI chatbots can recommend products and answer questions. H&M’s chatbot fields product questions and offers styling suggestions to fight cart abandonment. Models used: NLP, reinforcement learning, generative AI.
  • AI-powered Voice Commerce: AI supports hands-free shopping via voice assistants. For example, Walmart’s Voice Order AI lets customers add items to their carts with Siri or Google Assistant. That boosts customer satisfaction while driving sales. Models used: speech recognition, NLP, intent recognition.
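As a toy illustration of the collaborative filtering listed above, the sketch below scores item-to-item similarity with cosine similarity over a tiny, invented ratings matrix. Real recommendation engines operate at vastly larger scale with matrix factorization or deep models.

```python
import math

# Hypothetical user-to-item ratings (1-5). Names and values are illustrative.
ratings = {
    "alice": {"sneakers": 5, "jacket": 3, "scarf": 1},
    "bob":   {"sneakers": 4, "jacket": 4, "scarf": 1},
    "carol": {"sneakers": 1, "jacket": 1, "scarf": 5},
}

def item_vector(item):
    """Rating column for one item, in a fixed user order (0 if unrated)."""
    return [ratings[u].get(item, 0) for u in sorted(ratings)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Items whose rating patterns align are candidates for "you may also like".
sim_sneaker_jacket = cosine(item_vector("sneakers"), item_vector("jacket"))
sim_sneaker_scarf = cosine(item_vector("sneakers"), item_vector("scarf"))
```

Here sneakers and jackets score as far more similar than sneakers and scarves, so a shopper buying sneakers would be shown jackets first.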

AI in Price Strategies and Retail Business Insights
Picture this: You check your dashboard, and it’s nothing but wins. AI-powered dynamic pricing has adjusted your product prices in real time. Your store stays automatically competitive without slashing margins. Sales are up, profit per item is optimized, and customers are happy. From fraud detection to in-store promotions, AI can help you grow your revenue.

  • AI-driven Dynamic Price Optimization: AI adjusts prices based on rival pricing and demand. In one example, Amazon’s AI pricing algorithm updates prices every 10 minutes. But it’s not all peaches and cash registers. The tool has also landed the retailer in hot water for potential unfair competition. Models used: reinforcement learning, regression analysis, neural networks.
  • AI-powered Fraud Detection: Artificial intelligence can spot fraud in real time, via object detection and pattern recognition. CVS uses AI-driven fraud detection to flag suspicious coupon use and prevent inventory theft at self-checkouts. Models used: anomaly detection, machine learning, machine vision.
  • Digital Pricing Labels: AI-enabled shelves dynamically update prices and promotions. Kroger’s EDGE digital shelf system changes pricing in real time, displays promotions, and reduces the need for manual updates. However, it recently came under fire from several senators as a threat to consumer privacy. Models used: IoT analytics, reinforcement learning, machine vision.
  • AI for Retail Media and In-store Digital Signage: AI tailors in-store ads based on shopper demographics and behavior. Adidas’s London flagship store features smart fitting rooms equipped with RFID-enabled interactive mirrors. The mirrors recognize products brought into the fitting room and provide detailed information, letting customers request different sizes or colors without leaving the space. Models used: smart vision, deep learning, behavioral analytics.
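A heavily simplified stand-in for the dynamic pricers described above: the rule-based sketch below nudges a price toward demand and competitor levels inside margin guardrails. All names, bounds, and factors are assumptions for illustration; production systems learn such policies (e.g., via reinforcement learning) rather than hard-coding them.

```python
def adjust_price(base_price, demand_ratio, competitor_price,
                 floor_margin=0.9, ceiling=1.2):
    """Nudge price toward demand and competition, within guardrails."""
    target = base_price * demand_ratio               # demand_ratio > 1 means hot demand
    target = min(target, competitor_price * 1.05)    # stay near competitor pricing
    target = max(target, base_price * floor_margin)  # protect margin floor
    return round(min(target, base_price * ceiling), 2)

# Hot demand on a $50 item, with a competitor at $55: price rises but stays
# within 5% of the competitor.
price = adjust_price(base_price=50.0, demand_ratio=1.3, competitor_price=55.0)
```

The guardrails matter as much as the signal: without a floor and ceiling, automated repricing can spiral into the margin-destroying price wars the regulators cited above worry about.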

Retail Operations and Workforce Optimization
Retail theft rose 93% in 2024, according to the National Retail Federation. Artificial intelligence is tackling these losses and improving store layouts, staffing, and operations. AI helps retailers process data from shopper behavior and generate valuable insights to improve security.

  • AI-optimized Store Layouts: AI analyzes motion patterns and shopping behavior to improve store design. Macy’s uses AI-driven heat mapping to track foot traffic. They use it to put high-margin products in prime locations, increasing sales. Models used: smart vision, clustering algorithms, behavioral analytics.
  • AI for Workforce and Retail Operations: AI automates scheduling and task assignments, helping digital and physical stores run smoothly. Walmart’s AI workforce tool predicts peak hours and optimizes employee shifts. This cuts overtime costs and keeps stores staffed during busy times. Models used: predictive analytics, reinforcement learning, decision trees.
  • AI for Theft Prevention and Shrinkage Reduction: Cameras and computer vision process data in real time to detect theft. Home Depot’s AI security system flags suspicious activity to cut losses at self-checkouts. Models used: object detection, anomaly detection, deep learning.
  • AI for Store Traffic Prediction and Customer Insights: AI forecasts foot traffic using weather, events, and past trends. Starbucks’ AI demand model predicts store-level traffic, helping adjust staffing and stock levels for peak times. Models used: time-series forecasting, neural networks, regression analysis.

Marketing and Customer Data Insights
Oh no. Your latest marketing campaign just flopped. The discounts were too shallow, and the email blast barely moved the needle. Meanwhile, your competitors are using AI to predict demand and send the right promotions at the perfect moment. Without AI, you’re stuck wasting budget while customer retention slips away.

  • AI-powered Marketing Campaigns: AI processes data to draft highly targeted promotions. For instance, Nike’s AI-driven campaigns analyze shopper history to personalize ads and increase conversions. Models used: machine learning, trend analysis, clustering algorithms.
  • Behavioral Analytics for Customer Engagement: AI tracks shopping habits, sentiment, and retention trends. Sephora’s AI analytics predict which customers are likely to churn, then trigger retention offers before they leave. Models used: behavioral analytics, deep learning, sentiment analysis.
  • AI for Omnichannel Retailing: AI connects in-store and online experiences. In one example, Walmart’s AI-powered fulfillment system fills online orders from the closest store. Models used: predictive modeling, reinforcement learning, logistics optimization.
  • AI for Hyper-personalized Email and SMS Marketing: AI tailors outreach based on customer intent, preferences, and purchase history. Amazon’s AI email engine recommends products customers are most likely to buy. This can drive repeat purchases and boost retention. Models used: natural language processing, recommendation engines, deep learning.

Security and Retail Risk Management
Retail fraud is projected to exceed $100 billion annually. AI technologies help retailers head off fraud and churn. From digital sensors to predictive analytics, AI secures transactions and strengthens trust.

  • AI for Fraud Detection: AI monitors transactions to detect fraud. For example, CVS uses AI at self-checkouts to spot suspicious scan-and-bag behaviors. Models used: anomaly detection, machine learning, machine vision.
  • Predicting Customer Churn and Retention: AI flags at-risk customers before they leave. For instance, Spotify’s AI retention model can sense when users are about to cancel. It then sends personalized offers to keep them engaged. Models used: predictive analytics, deep learning, behavioral modeling.
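As a minimal illustration of the anomaly-detection approach named in these bullets, the sketch below flags transaction amounts more than three standard deviations from the mean. The data and threshold are illustrative; real fraud models combine many behavioral features, not just amounts.

```python
import statistics

def flag_anomalies(amounts, z_threshold=3.0):
    """Return amounts whose z-score exceeds the threshold."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    return [a for a in amounts if stdev and abs(a - mean) / stdev > z_threshold]

# Ten ordinary transactions around $40, plus one outlier.
txns = [42.0, 39.5, 41.2, 40.8, 38.9, 43.1, 40.0, 39.7, 41.5, 40.3, 950.0]
suspicious = flag_anomalies(txns)
```

In practice the flagged transactions would feed a review queue rather than an automatic block, keeping a human in the loop for final calls.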

Challenges of AI Adoption in Retail

Even though AI promises frictionless shopping, 74% of retailers struggle with AI adoption. From high costs to motion analytics accuracy issues, challenges remain. Even giants like Walmart and Target are still refining their AI strategies.

Challenge | Impact | Solution
High AI Software Implementation Costs | AI requires major upfront investment in software, infrastructure, and training. | Start small—pilot AI in one area (like motion analytics for store layouts) before scaling.
Data Privacy and Ethical Concerns | AI collects vast amounts of shopper data, raising privacy risks. | Ensure GDPR and CCPA compliance, encrypt data, and be transparent about AI’s role.
Data Cleaning and Quality Issues | AI is only as good as the data it processes. Bad data leads to bad predictions. | Regularly clean, structure, and update datasets—automate where possible to improve accuracy.
AI System Inefficiencies and Inaccuracies | AI isn’t perfect—forecasting errors can lead to overstock or empty shelves. | Use continuous data updates and human oversight to refine AI models and reduce errors.
Impact on Workforce Roles and Automation | AI streamlines tasks but changes job roles. | Retrain staff for AI-assisted positions, shifting them to higher-value activities.

Best Practices for Implementing AI in Retail

AI won’t fix a broken strategy—it will only automate the chaos. Retailers who rush into AI without a clear plan often end up with mispriced products. They can also suffer from inaccurate demand forecasts or wasted tech investments. Whether you need sales prediction or fraud detection, success starts with a structured approach. Follow these best practices to drive real business impact.

1. Define Business Goals
Pinpoint where AI technologies can solve problems. Are you optimizing inventory, improving customer interaction, or reducing fraud? Set measurable objectives to track success—like cutting stockouts by 20% or increasing conversions by 15%.

2. Choose the Right Technology Partner
AI success depends on an IT partner that targets compliance, security, and integration. The CNXN Helix Center for Applied AI and Robotics delivers AI technologies that follow GDPR and CCPA, use encrypted storage, and provide expert guidance. Our experts help leading retailers navigate AI complexities and drive real business impact.

For more information about the CNXN Helix Center for Applied AI and Robotics, contact your Account Team or drop us a line at AI@Connection.com

3. Collect and Clean Data
AI is only as good as its data. Ensure structured, high-quality datasets from sales, customer behavior, and supply chains. Remove duplicates and errors. If the data is messy, AI will produce unreliable insights.
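A minimal sketch of that cleaning step, assuming hypothetical record fields (`sku`, `date`, `units`): drop exact duplicates and rows with missing or invalid values before they reach a model.

```python
def clean_records(records):
    """Remove duplicate, incomplete, and invalid sales rows."""
    seen, cleaned = set(), []
    for rec in records:
        key = (rec.get("sku"), rec.get("date"), rec.get("units"))
        if None in key or rec["units"] < 0:  # incomplete or invalid row
            continue
        if key in seen:                      # exact duplicate
            continue
        seen.add(key)
        cleaned.append(rec)
    return cleaned

raw = [
    {"sku": "A1", "date": "2025-03-01", "units": 10},
    {"sku": "A1", "date": "2025-03-01", "units": 10},  # duplicate
    {"sku": "B2", "date": "2025-03-01", "units": -5},  # impossible value
    {"sku": "C3", "date": None, "units": 4},           # missing field
]
clean = clean_records(raw)
```

Rules like these are typically automated in a pipeline so every batch of sales, behavior, and supply chain data is validated before training or scoring.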

4. Ensure Ethical AI and Data Privacy
Retail AI processes vast amounts of customer information—protecting it is critical. Use encrypted storage, comply with GDPR and CCPA, and ensure transparency. Customers should know how their data is used and have control over their preferences. Ethical AI builds trust and prevents compliance risks.

5. Choose the Right AI Tools
Select AI platforms that fit your needs. Cloud-based solutions like Azure, AWS, or Google Cloud offer scalability, while retail-specific AI tools provide targeted insights. A strong technology partner can accelerate deployment and help you meet customer expectations.

6. Train AI Models
AI learns from past data to make accurate predictions. Start with a small dataset, test different algorithms, and refine the model. Continually update AI with fresh data to improve performance over time.

7. Integrate AI with Existing Systems
AI should work seamlessly with POS, CRM, and inventory management tools. Connect data sources for real-time insights. This integration streamlines operations and lets employees focus on higher-value activities instead of manual tasks.

8. Test, Adjust, and Scale
Run AI models in parallel with existing systems. Compare results, tweak settings, and scale once accuracy is proven. AI isn’t a one-and-done deployment—it requires ongoing optimization based on performance data.

Best AI Tools and Solutions for Retailers

Even though 87% of retailers have adopted AI, many still struggle to pick the right tools. From Amazon’s recommendation engine to Lowe’s in-store assistants, here’s how leading AI solutions are reshaping the retail business.

Tool | Use Cases | Adoption | Reception
IBM Watson | Customer interactions, personalized shopping | Used by major retailers | Enhances interactions; some find integration complex.
Microsoft Azure AI | Sales prediction, inventory optimization | Widely used in retail business sector | Improves efficiency; requires substantial setup.
Google Cloud AI | AI analytics, fraud detection, supply chain | Adopted by enterprise retailers | Strong AI tools; requires expertise to implement.
Amazon Personalize | AI-driven recommendations, dynamic pricing | Used by thousands of retailers | Boosts conversions; recommendations can feel repetitive.
Salesforce Einstein | AI-powered CRM, shopper intelligence, automation | Used by 150,000+ businesses | Automates engagement; setup can be complex.
Zebra SmartSight | Shelf-scanning robots for inventory management | Used by Walmart, Best Buy | Prevents stockouts; raises automation concerns.
OpenAI GPT | AI chatbots, automated product descriptions | Growing adoption in retail | Improves service and product descriptions; needs oversight.
SAP AI | AI-driven supply chain and logistics optimization | Used by enterprise retailers | Increases efficiency; requires strong data integration.
Oracle Retail AI | AI demand forecasting and fraud detection | Used by global retail industry chains | Reduces loss; models require ongoing training.
NVIDIA AI | AI-accelerated computing for machine learning | Used by top retailers and cloud providers | Enhances AI speed and efficiency; needs compatible infrastructure.
Intel AI | AI hardware acceleration for retail industry applications | Used by enterprise retailers | Boosts AI performance; integration can be complex.
AMD AI | AI processing for edge computing and analytics | Growing in retail AI systems | Improves efficiency; needs compatible software solutions.
Qualcomm AI | AI for mobile and edge retail applications | Used by smart retail solutions | Enables edge AI; limited to supported hardware.
Lenovo AI Solutions | AI-powered infrastructure for retail operations | Used by retail enterprises | Provides end-to-end AI solutions; requires tailored deployment.
Dell AI Solutions | AI-driven automation and analytics for retail | Widely used in large retailers | Strong enterprise capabilities; requires customization.
HPE AI | AI-driven retail analytics | Used by enterprise retailers | Powerful processing; needs IT expertise for implementation.

The Future of AI in Retail: What’s Next?

AI is shifting from a backend tool to a frontline experience. Amazon’s Just Walk Out stores remove checkout entirely, while Sephora’s AI beauty advisor delivers hyper-personalized recommendations. Expect AI to refine touch-free shopping, real-time inventory tracking, and sales forecasting. In the coming months, AI will continue to help retailers cut costs and deliver precision-driven customer experience.

Retailers that fail to adapt risk falling behind. AI will quickly respond to shifts in demand, personalize promotions at scale, and manage the supply chain like never before. Companies like Walmart and Nike are already integrating AI into logistics, pricing, and product launches, proving that automation isn’t just a trend—it’s the new competitive edge.

Take the Next Step with CNXN Helix
AI is transforming retail, but success depends on the right strategy and the right partner. Most retailers have either fully adopted AI or have launched pilot programs. Yet many are failing to realize ROI. They face significant challenges with cleaning and managing data and integrating artificial intelligence with legacy business processes.

That’s where Connection comes in. With the CNXN Helix Center for Applied AI and Robotics, Connection delivers:

  • Tailored AI solutions—Custom-built to fit your business needs
  • Integration—AI that works with your existing systems, not against them
  • Industry-leading partnerships—AI solutions built with NVIDIA, Intel, AMD, AWS, Google Cloud, and Microsoft Azure to power your AI success

Retail leaders like Walmart and Amazon are setting the pace—will you keep up? Let’s build your AI advantage today.

For more information about the CNXN Helix Center for Applied AI and Robotics, contact your CNXN Account Team or drop us a line at AI@Connection.com

]]>
Human[X]: A Thought-provoking Dive into... https://community.connection.com/humanx-a-thought-provoking-dive-into-ais-uncharted-future/ Mar 27, 2025 Jamal Khan https://community.connection.com/author/jamal-khan/ https://community.connection.com/wp-content/uploads/2025/03/2995988-HumanX-Conference-BLOG.jpg

Last week, I attended Human[X], a conference that, at first glance, was an unknown quantity. In an era where nearly every event markets itself as “AI-focused,” picking the right one can be a gamble. However, the lineup of speakers intrigued me, and despite my initial skepticism, I walked away impressed—so much so that Human[X] is now on my annual must-attend list. With a caveat that will come later.

Hosted at the Fontainebleau in Las Vegas, the event struck an ideal balance—large enough to house numerous insightful discussions yet intimate enough to facilitate meaningful interactions. The smaller audience turned out to be an advantage, allowing attendees to fluidly move between sessions, engage in deeper conversations, and extract more value. The challenge for next year? Scaling without losing this magic mix of intimacy, accessibility, and quality discourse. Hence the caveat.

The AI Landscape: Innovation Outpacing Application

One undeniable truth emerged from Human[X]: AI innovation is accelerating at an almost unmanageable speed, but real-world enterprise value is still being defined. The conference underscored a landscape in which AI models, expert systems, assistants, agentic frameworks, AI-enhanced applications, and embedded intelligence are evolving at a breakneck pace. Yet, paradoxically, organizations are still searching for the right use cases that balance business impact with feasibility.

This moment feels eerily reminiscent of the early Web era—a time when technology was advancing faster than businesses could absorb it, giving birth to an entirely new ecosystem. The same is happening with AI: while the fundamental building blocks exist, we are in the formative stages of discovering the real economic drivers of AI adoption.

Key AI Themes from Human[X]

Among the many discussions, some themes stood out as particularly critical for the near-term and long-term evolution of AI:

1. Agentic Frameworks: AI as Autonomous Decision-Makers

One of the most exciting (and troubling) developments discussed was the rise of agentic frameworks—AI systems that not only analyze and recommend but autonomously execute tasks within a defined scope. This marks a fundamental shift from AI as an assistant to AI as an active business participant.

For example, open-source projects such as Auto-GPT and BabyAGI, which are built on OpenAI’s models, are early attempts at AI agents capable of independently breaking down complex tasks and iterating towards goals. Research from McKinsey suggests that AI-driven process automation could replace up to 30% of business tasks in key sectors like finance, law, and healthcare by 2030. If refined and broadly adopted, these systems could automate entire business functions, significantly reducing the need for human oversight.

2. Trust Models for AI: The Precursor to Scale

Trust remains a critical barrier to AI adoption. As with cloud computing in its early days, organizations will not deploy AI at scale without confidence in security, bias mitigation, explainability, and regulatory compliance.

At Human[X], speakers repeatedly emphasized the need for trust frameworks—a structured approach to ensuring AI is deployed responsibly. Examples include:

  • Microsoft’s Responsible AI Framework, which integrates transparency and risk assessment into AI deployments.
  • NIST’s AI Risk Management Framework, a U.S. government-led initiative aimed at standardizing AI governance.
  • EU’s AI Act, which seeks to categorize AI applications by risk level, limiting use in high-risk scenarios like biometric surveillance.

Without a widely accepted AI trust model, enterprises will remain hesitant, and regulatory ambiguity will continue to serve as a brake on adoption.

3. AI for Cybersecurity: Automating the “Grunt Work”

Cybersecurity is a domain where AI is already making an impact—albeit in a limited way. Most cybersecurity tools today leverage AI for anomaly detection, log analysis, and threat intelligence, but we are rapidly moving towards autonomous cybersecurity agents capable of defending networks without human intervention. According to Gartner, AI-driven Security Operations Centers (SOCs) are projected to reduce manual cybersecurity workloads by 40% by 2027, thanks to AI’s ability to detect and respond to threats faster than human analysts.

Discussions at Human[X] revolved around AI’s role in:

  • Automating Security Operations Centers (SOCs): AI handling Tier-1 security tasks, reducing false positives, and allowing human analysts to focus on critical threats.
  • Threat Hunting with AI: AI-driven systems proactively seeking vulnerabilities rather than reacting to attacks.
  • Self-Healing Networks: AI autonomously responding to breaches, mitigating attacks before human intervention.

While AI will not fully replace human expertise in cybersecurity in the short run, its ability to automate repetitive tasks and augment human analysts is already proving invaluable.

The Elephant in the Room: AI and Job Displacement

One major frustration I had at Human[X]—and at many AI conferences—is the unwillingness to address the job displacement debate with honesty. Many speakers contorted themselves to emphasize AI’s role in enhancing productivity rather than eliminating jobs. While this is partially true, it fails to acknowledge the inevitable second-order effects.

Short-term: The Rise of the AI-augmented Worker

In the near term, AI will boost individual productivity. Employees will be expected to operate at a higher level, leveraging AI as a force multiplier. This aligns with Atif Rafiq’s concept of a “higher bar for employee excellence”—where workers must bring more value to remain relevant.

Long-term: AI Will Have a Net Negative Impact on Jobs

However, long-term job growth claims are questionable at best. Studies from the World Economic Forum (WEF) suggest AI will create 97 million new jobs by 2025, but these figures fail to weigh the scale of job losses against job creation. And the WEF’s projection of such gains by 2025 already seems overly optimistic.

  • McKinsey’s research predicts up to 800 million jobs could be lost to automation by 2030.
  • A 2023 MIT study found that while AI does create new jobs, most require specialized skills that displaced workers do not possess.
  • Goldman Sachs estimates that AI-driven automation could replace 300 million full-time jobs globally.

I remain deeply skeptical about workforce re-skilling initiatives closing this gap. The narrative that AI job displacement will be balanced by job creation lacks empirical validation. Historically, re-skilling efforts that lacked policy and regulatory support, such as those meant to address industrial automation in the 1980s and 1990s, failed and eventually gave us the Rust Belt with its social and political fallout. Those past experiences should be a clarion call for policymakers around the world. It is not just about new jobs emerging, which in itself seems to be a tall order—it is also about whether the displaced workforce can transition into them.

Instead of ignoring this reality, we need real policy discussions about how to manage workforce transitions, develop re-skilling programs, and mitigate economic fallout. Pretending job displacement isn’t happening does not make it any less real.

Final Thoughts: The Road Ahead

Human[X] successfully captured the complexity, diversity, and velocity of AI’s evolution. The major takeaways?

✔️ AI’s trajectory is moving faster than businesses can absorb.

✔️ Agentic AI frameworks are poised to transform business operations.

✔️ Trust models are essential for AI adoption at scale.

✔️ AI’s role in cybersecurity is growing rapidly.

✔️ AI-driven job displacement is real, and we must start discussing it openly.

AI will disrupt almost every industry, and the speed of disruption will surprise everyone. The only question is: Are we prepared for the transformation that is coming?

On to the next conference. NVIDIA GTC.

What are your thoughts? Are we being honest enough about AI’s impact on jobs? Let’s have the conversation that needs to happen.

#AI #ArtificialIntelligence #AIinBusiness #FutureOfWork #AITrust #JobDisplacement #AIRegulation #AIAdoption #AIConferences #HumanX #NvidiaGTC #CNXNHelix #WeSolveIT #WeSolveAI #ConnectionIT

Generative AI was used in the creation of this blog post.

]]>
TechSperience Episode 136: Future-proofing... https://community.connection.com/techsperience-episode-136-future-proofing-it-operations-with-aiops-and-automation/ Mar 20, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/03/2963250-TechSperience-Episode-136-Blog-BLOG-1.jpg

AIOps is revolutionizing IT operations, enabling businesses to manage infrastructure more efficiently than ever. In this episode, we explore how automation, AI, and machine learning are reshaping IT management—driving self-healing systems, autonomous operations, and proactive performance optimization.

As organizations scale, ensuring resilience, minimizing downtime, and controlling costs are more critical than ever. Join us for a forward-looking discussion on the key trends fueling AIOps adoption and the strategies IT leaders can use to future-proof their operations. Expect actionable insights on leveraging AI-driven automation for smarter, more scalable, and cost-effective IT management.

Speakers: Taruna Gandhi, Head of Product Marketing for OpsRamp, Hewlett Packard Enterprise
Cameron Bulanda, VP of Technical Sales and Centers of Excellence, Connection

Show Notes:

00:00 Introduction to AI in Operations

03:11 The Importance of Observability

05:54 Breaking Down Silos with AIOps

09:10 Real-life Examples of Self-healing Systems

12:02 Proactive Management with AI Insights

14:51 Leveraging Anonymized Data for Better Insights

18:05 Future of AI in Operations Management

20:49 Driving Sustainability with AI

23:59 Choosing the Right Partner for AI Solutions

]]>
Why Enterprises Should Consider AMD and AI... https://community.connection.com/why-enterprises-should-consider-amd-and-ai-pcs-in-2025/ Mar 18, 2025 Gaston Sandoval https://community.connection.com/author/gaston-sandoval/ https://community.connection.com/wp-content/uploads/2025/03/2954271-AMD-AI-PCs-BLOG.jpg

AI PC capabilities have evolved rapidly in the two years since AMD introduced the first x86 AI PC CPUs at CES 2023. New neural processing units have debuted, pushing available performance from a peak of 10 AI TOPS at the launch of the AMD Ryzen™ 7 7840U processor to peak 50+ TOPS on the latest AMD Ryzen™ AI Max PRO 300 Series processors. A wide range of software and hardware companies have announced various AI development plans or brought AI-infused products to market, while major operating system vendors like Microsoft are actively working to integrate AI into the operating system via its Copilot+ PC capabilities. AMD is on the forefront of those efforts and is working closely with Microsoft to deliver Copilot+ for Ryzen™ AI and Ryzen™ AI PRO PCs.

In the report “The Year of the AI PC is 2025,” Forrester lays out its argument for why this year is likely to bring significant changes for AI PCs. Forrester defines the term “AI PC” to mean any system “embedded with an AI chip and algorithms specifically designed to improve the experience of AI workloads across the computer processing unit (CPU), graphics processing unit (GPU), and neural processing unit (NPU).” This includes AMD products, as well as competing products made by both x86 and non-x86 CPU manufacturers. 2025 represents a turning point for these efforts, both in terms of hardware and software, and this Forrester report is an excellent deep dive into why AI PCs represent the future for enterprise computing.

Why 2025 Is the Year of the AI PC

First, and arguably most prosaically, 2025 is the year of the AI PC because CPU designers like AMD are baking dedicated AI processing into new products just as Windows 10 is approaching the formal end of support. While consumers and businesses will have the option to buy extra years of security updates, October 14, 2025 is the end of free support and the likely transition date to Windows 11 for millions of companies worldwide that don’t want to pay an increasingly large yearly premium for the privilege of running an older operating system. Many of the Windows 11 commercial PCs purchased in the next 12 months are going to support AI via dedicated NPU hardware and will therefore qualify as AI PCs. According to IDC, 93.9% of commercial PCs will be AI PCs by 2028.

The advantage of this distribution model is that customers and early adopters are not being asked to sacrifice the CPU and GPU efficiency improvements they would otherwise expect from new systems in exchange for AI compatibility. The AI PC platforms AMD launched throughout 2024 and into 2025, including the AMD Ryzen™ AI PRO 300 Series and the Ryzen AI Max PRO Series, feature improvements like new CPU cores based on the “Zen 5” microarchitecture, AMD RDNA™ 3.5 GPU cores, and AMD PRO Technologies to provide security features, manageability tools, and enterprise-grade stability and reliability. These advances arrived alongside the NPU and its dedicated AI processing capability, ensuring AMD customers continue to benefit from new systems in the legacy non-AI workloads they already use.

Second, AI PCs offer a flexible platform that executes emerging AI workloads across the CPU, GPU, and NPU, depending on where it will run best. It’s not hard to see the parallels between NPU development and the GPU-powered 3D revolution that kicked off back in the mid-1990s.

While specialized 3D graphics initially required the use of discrete graphics accelerator cards, this hardware eventually moved on-die and was integrated directly alongside the CPU. This allowed AMD to dramatically improve the performance and capability of the integrated graphics processor, or iGPU, without driving up cost. Many applications now rely on the iGPU for tasks that previously required a discrete card, and graphics capabilities have become an integral part of a CPU’s overall value proposition. Adding an NPU to the CPU and steadily increasing its performance is in line with this innovation. Including an NPU gives developers a dedicated platform for running AI workloads, one that’s tightly knitted to the CPU and that can be expected to steadily improve over time.

Today, AI PCs can already offload background tasks that would otherwise require CPU or GPU processing time. More advanced use cases are rapidly proliferating. AMD introduced the first x86 AI PCs in 2023. Two years later, the company’s CES 2025 exhibition showcased multiple practical demos. This rapid adoption highlights the NPU’s potential efficiency and strength as a workload execution environment compared to the more traditional CPU or GPU.

While CES is largely a consumer-focused show, the broad discussion of AI across both consumer and commercial workloads shows the appeal of integrating artificial intelligence across the entire software market. According to its report, Forrester expects AI and AI PCs to take significant strides towards wide deployment over the next 12 months. This will happen thanks to the Windows 11 refresh cycle, wider AI incorporation across the software industry, and because the increased availability of AI PCs will give companies a cost-effective platform for less intense local AI workloads.

What all of this adds up to is an exciting time for enterprise computing. AI PCs have the potential to revolutionize the industry, and AMD offers a complete lineup for all end-user needs that gives companies extensive options to evaluate how they want to engage with that opportunity.

]]>
The Future of Self-Service: How Your... https://community.connection.com/the-future-of-self-service-how-your-business-can-transform-customer-experiences/ Mar 13, 2025 Tanya Tretyak https://community.connection.com/author/tanya-tretyak/ https://community.connection.com/wp-content/uploads/2025/03/2950171-Zebra-Kiosk-Email-Program-BLOG.jpg

The demand for self-service solutions is rapidly growing across industries as businesses seek innovative ways to enhance efficiency and meet evolving customer expectations. From retail and healthcare to quick-service restaurants and hospitality, self-service kiosks are transforming how companies interact with their customers by delivering faster service, greater convenience, and improved operational performance.

Zebra’s recently launched Android-based kiosk solution is at the center of this transformation, ready to streamline operations and elevate the customer experience. Let’s explore the key trends driving the self-service revolution and how your business can leverage these solutions for long-term success.

Self-Service Is Revolutionizing the Customer Experience

Consumers today are more tech-savvy than ever. They want speed, autonomy, and control over their transactions. Whether it’s ordering food, checking into a hotel, purchasing tickets, or even registering for healthcare appointments, self-service technology is reshaping the customer journey.

Several key factors are fueling the rapid adoption of self-service solutions:

  • Changing Consumer Expectations
    Customers expect seamless digital interactions that mirror their online experiences. Self-service kiosks provide an intuitive, user-friendly interface that allows for faster service and reduced wait times—critical factors in driving customer loyalty.
  • Labor Shortages and Rising Costs
    With many industries facing labor shortages, businesses are turning to automation to fill service gaps without compromising customer experience. Self-service kiosks help businesses reduce dependency on staff while maintaining high-quality service.
  • Operational Efficiency
    Self-service technology streamlines workflows, reducing bottlenecks and improving transaction accuracy. By automating routine tasks, employees can focus on higher-value interactions, such as personalized customer assistance.
  • Enhanced Security and Contactless Payments
    In a post-pandemic world, contactless transactions have become a standard expectation. Self-service kiosks minimize physical interactions, ensuring a safer environment for both customers and employees.

Industry Use Cases: Where Self-Service Kiosks Are Making an Impact

Retail: A Frictionless Shopping Experience
Retailers are leveraging kiosks for self-checkout, product information lookup, and in-store ordering. These solutions eliminate long checkout lines, enhance inventory visibility, and provide personalized recommendations through AI-powered interfaces.

Quick-Service Restaurants: Speeding Up Food Orders
Fast food chains and QSRs are implementing self-service kiosks to allow customers to customize their orders, pay seamlessly, and receive faster service. This not only reduces order errors but also boosts revenue through upselling features.

Healthcare: Improving Patient Check-in and Accessibility
In the healthcare industry, self-service kiosks streamline patient registration, appointment scheduling, and insurance verification. These solutions enhance patient privacy and efficiency while reducing administrative burdens.

Hospitality: Elevating Guest Experiences
Hotels and resorts use self-service kiosks for express check-ins / check-outs, room key dispensing, and booking attractions and services. This creates a more personalized and hassle-free experience for travelers.

Transportation: Effortless Ticketing and Boarding
Airports, train stations, and public transit systems have widely adopted self-service kiosks for ticket purchases, baggage check-in, and boarding pass printing, significantly reducing wait times and improving traveler flow.

The Zebra Kiosk System: Drive Customer Engagement and Growth

As businesses transition to self-service models, choosing the right technology is crucial. With Zebra’s cutting-edge kiosk solutions, businesses can reduce operational costs, enhance customer engagement, and drive higher revenue growth.

Zebra’s modular kiosk system is designed to meet the needs of a myriad of self-service, assisted checkout, and digital display use cases:

• Zebra’s KC50 15" / 22" Android kiosks and TC50 touch displays with a modular stand system support hundreds of configurations

• Tap to pay and digital wallet ready for one-stop shop operations

• AI-capable personalization for upselling and recommendations

• Robust hardware built for high-traffic environments

• Cloud-based remote device management for real-time monitoring and updates

• Content-management software ready to push hyper localized content remotely in digital display use cases

Self-Service Is the Future. Is Your Business Ready?

The self-service revolution is here. Businesses that embrace automation will gain a competitive advantage. By integrating Zebra’s innovative kiosk system, companies can elevate customer experiences, improve efficiency, and future-proof their operations.

Contact us today and learn how Connection can help make Zebra’s kiosk solutions your competitive advantage.

]]>
6 Compelling Reasons IT Managers Should... https://community.connection.com/6-compelling-reasons-it-managers-should-consider-modernizing-to-amd-epyc-processors/ Mar 11, 2025 Dennis McQueen https://community.connection.com/author/dennis-mcqueen/ https://community.connection.com/wp-content/uploads/2025/03/2976521-Blog-AMD-EPYC-BLOG.jpg

Today’s IT managers face immense pressure for a variety of reasons: they need to activate new revenue streams, develop strategies to support changing workforce requirements, and integrate new capabilities to improve efficiency within the business. While upgrading to new servers can alleviate these issues, cost is often perceived by many as a barrier. But what if I told you the cost of these new servers could be recouped in as little as two months?

In this short blog, I will share these and other key takeaways from our recent white paper, Modernize Your Data Center Virtualization with AMD EPYC™ Processors.

Modernization Is Needed

The average age of today’s data center servers is over five years. Legacy infrastructure can pose significant issues, including:

  • Requiring more servers to deliver a given level of performance, leading to more energy consumed and higher maintenance and operational costs
  • Outdated process technology that can decrease energy efficiency, increase unplanned downtime, and leave systems more prone to security vulnerabilities

A server refresh, once difficult to justify, is becoming an increasingly sound investment. AMD EPYC™ processors address these issues, offering a cost-effective pathway to modernization without sacrificing performance. As you’ll see below, by upgrading to AMD EPYC™-powered servers, IT teams can boost performance while consolidating their servers, optimizing software licensing costs, and enhancing overall data center efficiency.

The Economics of Upgrading to AMD EPYC™ Servers Are Compelling

Consider a setup with 1,000 servers, each running the widely deployed dual 28-core Intel® Xeon® 8280 processors and delivering an aggregate VMmark® 3.1 matched pair performance score of ~9020. Upgrading to just 336 servers with dual 48-core AMD EPYC™ 9474F processors delivers similar performance while reducing the number of servers by nearly two-thirds.

This upgrade lowers energy usage by an estimated 47% and slashes licensing costs by up to 42%, providing a swift payback period—which can be as short as two months.*
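The consolidation arithmetic is easy to verify from the figures quoted above. The server counts come from the white paper; the calculation below is purely illustrative:

```python
# Illustrative consolidation math using the server counts quoted above.
legacy_servers = 1000   # dual 28-core Intel Xeon 8280 systems
modern_servers = 336    # dual 48-core AMD EPYC 9474F systems

reduction = 1 - modern_servers / legacy_servers
print(f"Server count reduced by {reduction:.1%}")  # prints "Server count reduced by 66.4%"
```

That 66.4% reduction is the “nearly two-thirds” cited above; the 47% energy and 42% licensing savings are AMD’s estimates and will vary with workload mix.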

Software licensing expenses make it increasingly costly to continue operating legacy infrastructure. Modernizing can slash costs, boost performance, and create space for emerging applications, such as AI.

AMD Offers Simplified Migration in Collaboration with VMWare

For many IT managers, the perceived complexity of migrating virtual machines from Intel-based servers to AMD EPYC™ processor-based servers causes concern. But the truth is that migrating from Intel to AMD servers involves effort similar to upgrading from legacy Intel servers to new ones.

To further simplify migration to AMD EPYC™ servers, AMD offers the VMware Architecture Migration Tool (VAMT), developed in collaboration with VMware. This PowerShell-based tool leverages VMware PowerCLI to automate virtual machine migrations across x86 architectures, helping reduce migration risks and downtime. The four-step process includes selecting which AMD EPYC™ processors align with current workloads, downloading and configuring VAMT, tagging the virtual machines, and then executing and validating the migration.

This tool offers a straightforward, reliable route to modernization without sacrificing workload continuity.

Moving to AMD EPYC™ Processors Instead of Intel Xeon Alternatives Can Provide TCO Benefits

If you compare the choice of using the 4th Gen AMD EPYC™ processors with 5th Gen Intel Xeon processors, you’ll find impressive total cost of ownership (TCO) benefits when you move to AMD EPYC™ processors.

Building on the example above of switching out 1,000 legacy servers, an enterprise would need 328 2P Intel Xeon Platinum 8592+ servers compared to 336 2P AMD EPYC™ 9474F servers. The EPYC™ choice gives you an estimated 24% lower TCO over five years, up to 33% lower hardware CAPEX, up to 23% fewer cores, and up to 23% lower VMware licensing costs. (SP5TCO-073A)
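As a quick sanity check of the “up to 23% fewer cores” figure, the fleet totals can be recomputed from the quoted system counts. The arithmetic below is ours, and it assumes 64 cores per socket for the Xeon Platinum 8592+ and the 48 cores per socket quoted earlier for the EPYC™ 9474F; both configurations are two-socket (2P):

```python
# Total core counts for the two fleets described above (2-socket servers).
xeon_cores = 328 * 2 * 64   # 41,984 cores across the Xeon Platinum 8592+ fleet
epyc_cores = 336 * 2 * 48   # 32,256 cores across the EPYC 9474F fleet

fewer_cores = 1 - epyc_cores / xeon_cores
print(f"{fewer_cores:.0%} fewer cores")  # prints "23% fewer cores"
```

Fewer total cores matter because VMware licensing is increasingly priced per core, which is where the matching “up to 23% lower VMware licensing costs” figure comes from.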

Managing and Mitigating Blast Radius Risks

A common concern with high-core count servers is the potential for a larger blast radius—the number of virtual machines affected by a single system failure. Two primary technology advancements mitigate blast radius concerns: software resiliency and enhanced hardware reliability. Most modern, distributed application architectures, such as those found in VMware® vSphere®, can be built to run multiple instances of each software application at once, so that if one VM fails, services remain uninterrupted as the application fails over to the other VM. From the perspective of hardware reliability, AMD EPYC™ processors include robust reliability, availability, and serviceability (RAS) features, designed to minimize downtime through advanced error correction and detection mechanisms, such as Advanced Memory Device Correction (AMDC).

These innovations provide a resilient infrastructure capable of maintaining application uptime even during hardware issues, significantly reducing concerns about potential downtime or failure impact.

AMD Supports the Advancement of Your Data Center Efficiency Goals

Choosing AMD EPYC™ processors for high-performance computing enables organizations to advance their data center sustainability goals by helping reduce energy consumption. These processors allow companies to decrease their environmental footprint and cut costs through reduced server quantities and licensing fees. Ultimately, opting for AMD EPYC™ processors over Intel® Xeon® offers impressive total cost of ownership benefits, supporting both your financial and sustainability objectives.

Conclusion: A Future-ready Path with AMD EPYC™ Processors

The AMD EPYC™ processor-powered modernization approach offers a compelling case for any organization looking to boost performance, reduce costs, and meet efficiency targets. From significant savings in licensing and energy costs to enhanced reliability features, AMD EPYC™ processors pave the way for a seamless, future-ready data center.

]]>
Store-in-a-Box: How End-to-End Technology... https://community.connection.com/store-in-a-box-how-end-to-end-technology-deployment-changes-everything/ Mar 06, 2025 Brian Gallagher https://community.connection.com/author/brian-gallagher/ https://community.connection.com/wp-content/uploads/2025/02/2946176-Retail-Store-in-a-Box-Blog-BLOG.jpg

Can you relate to the stress of opening a new store? How about the stress of a technology refresh for every store all at the same time? The coordination of a million moving parts with a dozen different suppliers and contractors—opening a new store is stressful in so many ways. Whether I was opening a new store or one of our 100+ seasonal popups, the challenges were always the same when it came to technology.

I remember stacks of boxes sitting all over our POS labs and store operations. The reality was that my stack at the time was a network, POS system, mobile devices, and all the peripheral stuff. It was nowhere near as complicated as it is today. Yet we had to hope for so many things to happen perfectly, knowing we were just preparing for the guarantee of running into unexpected issues. All we could do was brace for the first phone call.

Retailers today are experiencing exponentially more challenges as they attempt to configure, integrate, and deploy a high volume of new technologies. They are navigating complex integration processes, a lack of system standardization, resource constraints, legacy systems, cost overruns, and security concerns—among other obstacles. For a more streamlined, scalable, and cost-effective configuration and integration process, organizations need expert support. The Connection Technology Integration and Distribution Center (TIDC) is well-positioned to provide that support. With an unwavering level of commitment to customer success, we design completely customized and tailored solutions to address these key challenges.

As a business leader, juggling the day-to-day operations is challenging enough. Adding in large scale projects can be an overwhelming process. Connection’s Store-in-a-Box Services have transformed the entire store opening and technology refresh process. The TIDC in Wilmington, OH is the one-stop shop for procurement, imaging, kitting, deployment, and depot services.

I can’t count the number of times I experienced things like:

  • Stacks of devices all over the office waiting for configuration and shipping
  • Billing complexity nightmares with products coming from a dozen different suppliers
  • Late store openings because equipment was missing or not working when expected
  • Contractors losing pallets of materials for a store opening or remodel
  • Employees saying they never received items you know for a fact were delivered

Connection Store-in-a-Box solutions start with our Quality-as-a-Mindset™ (QaaM™) methodology that empowers all employees to take ownership of quality. We provide every member of our team with the autonomy and tools needed to pursue continuous improvement, eliminate defects, and deliver the best possible client experience. By integrating QaaM™ protocols across all operations, adopting Lean methodologies, and holding ourselves accountable to world-class metrics, the TIDC ensures every client receives a flawless solution and enjoys an exceptional experience from start to finish.

What exactly can our Store-in-a-Box solutions deliver for you?

  • Procurement, Provisioning, and Imaging: A one-stop shop for purchasing and preparing your devices for successful deployment
  • Deployment Services: Cost-effective solutions to deploy and refresh equipment, including Program Management, Site Surveys, Technician Management, and Dispatch
  • Hot Swap and Depot Services: Advanced Exchange, Reverse Logistics, Repairs, In/Out Warranty, Authorized OEM Repair
  • ITAD: Achieve sustainability objectives through recycle, resale, and disposal; services include Preparations, Asset Management, Logistics, and Coordination

The technology complexity required to achieve success in retail is growing every day. Rely on a trusted partner with the expertise required to make your business a success.

]]>
UCaaS and AI: An Unbeatable Combination https://community.connection.com/ucaas-and-ai-an-unbeatable-combination/ Mar 04, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/02/2951471-Blog-UCaas-BLOG.jpg

“AI has just joined the meeting.”

That message is coming to a screen near you, and sooner than you think. But before you start worrying about AI agents eavesdropping on your conversations, consider the benefits of having an AI agent parse and summarize your meetings. Imagine that you miss the first thirty minutes of a meeting because of a scheduling conflict. An AI agent can now summarize the meeting for you, bring you up to speed, and even suggest questions that are relevant to your job to bring you into the conversation.

According to Paul Binder, Manager of Channel Partner Services at Connection and an avowed XaaS (Anything as a Service) expert, the preceding illustration is exactly the sort of beneficial effect that AI has on Unified Communications as a Service (UCaaS). In a recent webinar entitled Navigating the UCaaS Landscape, Binder, Terry Corder (Vice President of Product Management and Strategy, Fusion Connect), and Leon Wright (Senior Product Marketing Manager, Microsoft Teams) examined how organizations will derive more value from UCaaS in the near future by leveraging AI technologies in their voice, video, and text communications.

Making a Good Thing Better

Even before the advent of AI, the case for UCaaS was a compelling one: lower costs, enhanced features, better integration across the communications stack, etc. AI builds on the benefits of UCaaS by helping to turn voice, video, and text information into actionable data that can be queried in real time or used to facilitate real-time conversations with AI agents. As Leon Wright sees it, AI’s real superpower lies in its ability to empower frontline workers with better decisioning tools.

It’s a perspective shared by Paul Binder, who noted that knowledge workers are already using AI tools like Microsoft Copilot to provide call summaries and act as an intelligent assistant that remembers who said what. So, knowledge workers can join a call today and, using Copilot, not only ask it to recommend relevant questions but also indicate who on the call is the right person to ask. It’s these sorts of UCaaS use cases that highlight how AI can enhance, rather than replace, human workers—particularly when you consider that the majority of people still prefer to talk to human beings and will for the foreseeable future.

Unify Costs and Enhance Security

Despite the rapid evolution and broad potential of AI technology, the move to UCaaS is still driven primarily by the desire to reduce costs and complexity in the communications stack. Terry Corder cited the example of one customer, a credit union, that had been using traditional copper lines and legacy applications to connect more than two dozen branch locations. By moving to a UCaaS solution on an SD-WAN architecture, the credit union was able to improve security (through network encryption), customize their communications features for each employee based on their roles, and move from high, unpredictable bills to a lower, predictable monthly cost. Not only did the back-office workers benefit from these changes, but so did the frontline workers—in this case, the credit union’s tellers.

UCaaS is about more than unified communications. It unifies costs, security and feature updates, and even the company itself by giving everyone access to the same communications tools, whether they’re in the office or working from home. All this unification naturally leads to higher employee productivity, as employees no longer have to spend hours each week toggling between different applications to attend videoconferences, call colleagues, send instant messages, and manage emails.

Mitigating Risk When Moving to UCaaS

Despite the rising popularity of UCaaS platforms, the move to UCaaS isn’t without its share of risk. Any time you’re dealing with mission-critical services such as telephony and email, a seamless move is crucial. For Paul Binder, Terry Corder, and Leon Wright, the adoption of UCaaS is a journey—one that depends on a clear and proven roadmap. In their experience, there are several potholes that organizations should watch out for when moving to a UCaaS platform:

  1. Mind your gaps. Assess your current communications gaps, including features that you’ll want in the future, and ensure that your UCaaS solution addresses those gaps.
  2. Detail your dependencies. Understand which tools and technologies will impact UCaaS, including integration points with Microsoft Teams and other critical applications.
  3. Calculate the cost upfront. The UCaaS solution itself is only part of the total cost. Transformation services (including potential downtime) and ongoing management should also be factored into the total cost of ownership.
  4. Expect delays. It will take time for some steps to complete, such as porting numbers from one service provider to another, so make sure you include that extra time in your transformation plans.
  5. Never on a Monday (or a Friday). Don’t plan moves on a Monday, because it’s the busiest day of the week, or on a Friday, since if something goes wrong you may not have the same level of technical support over the weekend.

Choosing the Right Partner(s) Is Key

Choosing the right partner—or partners—is critical to a successful transformation. Ideally, businesses will choose partners that work well together. For example, Connection, Fusion Connect, and Microsoft each bring their own unique expertise to UCaaS: Fusion Connect has the telephony and networking knowledge, Microsoft brings industry-leading communications and AI solutions to the table, and Connection is skilled in selecting, pricing, implementing, and managing UCaaS solutions for businesses of any size or industry.

Once a UCaaS solution is in place, businesses can then begin to overlay AI services on top of that solution. Here again, an experienced partner is important to help with things like tagging files for AI, setting security and access policies, integrating AI with existing CRM and call center applications, and creating test pilots for AI capabilities to measure their potential impact. Unification is really just the beginning. With AI layered on top, UCaaS has the potential to completely transform the user experience and the customer experience in any organization.

To learn more about UCaaS and AI, talk to your Connection representative or visit us online.

]]>
A Pulse on the State of Cybersecurity in... https://community.connection.com/a-pulse-on-the-state-of-cybersecurity-in-healthcare/ Feb 27, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/02/2940850-Cybersecurity-Healthcare-Infographic-BLOG.jpg

The healthcare industry is currently facing an unprecedented wave of cyber threats. In 2024 alone, healthcare organizations experienced an average of 1,999 attacks per organization, per week.1 This alarming trend is expected to continue as many healthcare providers struggle to update their security infrastructure to keep pace with the rapidly evolving cyber threat landscape.

One of the primary reasons for this vulnerability is the outdated infrastructure that many healthcare organizations still rely on. With AI driving the sophistication of cyber threats, these outdated systems pose significant risks to patient privacy, safety, and trust. The financial implications are also severe, with the average cost of a data breach in healthcare reaching $9.77 million, the highest across all industries.2

In 2024, 92% of healthcare organizations reported experiencing a cyberattack, marking a 4% increase from 2023.3 Ransomware attacks have been particularly prevalent, accounting for up to 11% of all attacks in the sector.1 The average recovery cost of a ransomware attack has doubled since 2021, now standing at $2.57 million.5

The statistics are stark:

  • 67% of healthcare organizations were victims of ransomware attacks in 2024.4
  • There was a 180% growth in attacks exploiting healthcare security vulnerabilities from 2023 to 2024.5

These figures highlight the urgent need for healthcare organizations to bolster their cybersecurity measures. Investing in modern, robust security infrastructure is not just about protecting sensitive information; it’s about safeguarding the trust and safety of patients.

For more detailed insights and to learn how your organization can enhance its cybersecurity posture, visit us online today. 

Sources: 

  1. Check Point Research Reports Highest Increase of Global Cyber Attacks seen in last two years – a 30% Increase in Q2 2024 Global Cyber Attacks: https://blog.checkpoint.com/research/check-point-research-reports-highest-increase-of-global-cyber-attacks-seen-in-last-two-years-a-30-increase-in-q2-2024-global-cyber-attacks/  
  2. Average Cost of a Data Breach Rises to $4.88M; Falls to $9.77M in Healthcare: https://www.hipaajournal.com/cost-healthcare-data-breach-2024/
  3. 2024 Ponemon Healthcare Cybersecurity Report: https://www.proofpoint.com/us/resources/threat-reports/ponemon-healthcare-cybersecurity-report
  4. The State of Ransomware in Healthcare 2024: https://www.sophos.com/en-us/solutions/industries/healthcare
  5. 2024 Data Breach Investigations Report Healthcare Snapshot: https://www.verizon.com/business/resources/Ta55/infographics/2024-dbir-healthcare-snapshot.pdf

To ensure transparency, please note that artificial intelligence and large language models may be utilized to enhance the content of this article. This approach helps refine and enrich the information presented, ensuring accuracy and depth.

]]>
TechSperience Episode 135: Unveiling the... https://community.connection.com/techsperience-episode-135-unveiling-the-hidden-threats-in-hyperconnected-healthcare-a-cyber-thrillers-take-on-iomt-security/ Feb 27, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/02/2952971-TechSperience-Ep135-UnveilingThreats-BLOG.jpg

In a world where hospitals rely on interconnected medical devices to save lives, there are continuous hidden vulnerabilities lurking beneath the surface. This podcast dives into the realities of IoMT security with John Chirillo, Principal Security Architect and author of the newly released novella Silent 1ntrusions, alongside cybersecurity expert Rob Di Girolamo, to break down the threats, the lessons, and what we can do to defend against them.

We’ll delve into the events Silent 1ntrusions’ main character, Dr. Kristi Chiro, experiences as she battles a relentless hacker: a pacemaker glitches, insulin pumps go haywire, and an entire hospital teeters on the edge of collapse. In an era of hyperconnected healthcare, how safe are we really?

Speakers:
John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection
Kimberlee Coombes, Security Solution Architect, Connection

Show Notes:
00:00 Introduction to IoMT Security and Silent Intrusions
02:46 Real-World Inspirations Behind Silent Intrusions
06:10 Exploring IoMT Vulnerabilities in Healthcare
08:49 Challenges in Securing IoMT Devices
11:46 Attack Scenarios and Realistic Threats
14:50 Key Takeaways for Healthcare Professionals
18:08 Future Threats in Healthcare Security

]]>
Beyond the Buzz: Are Interactive Displays a... https://community.connection.com/beyond-the-buzz-are-interactive-displays-a-smart-investment-for-your-school-district/ Feb 25, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/02/2924085-Samsung-PSG-WAF-BLOG.jpg

Interactive displays have become a powerful way to enhance collaborative and interactive learning in modern classrooms. However, with a wide array of technology options available, determining whether they are the right investment for your district can be challenging. 

Here are some of the key considerations for selecting classroom technology and why an interactive display, particularly Samsung’s new AI-powered WAF interactive display, ultimately stands out from the alternatives:

Interactive Displays vs. Alternative Options

Student-centric Devices (1:1 Programs)

Equipping each student with a device like a tablet or laptop (for example, an iPad or Chromebook) allows for personalized learning experiences.

Pros:

  • Portability and individual learning experiences
  • Supports differentiated instruction and learning at individual paces
  • Provides access to a vast array of educational apps and online resources

Cons:

  • Significant upfront investment for schools
  • IT management challenges and the potential for misuse and distraction
  • May not be ideal for collaborative, whole-class activities

Interactive Projectors

Interactive projectors transform any flat surface into a collaborative workspace. They use special sensors to detect touches on the projected image on a whiteboard or wall. This allows users to interact with the computer using their fingers or a special pen—much like a touchscreen.

Pros:

  • Portability and flexibility in the classroom
  • Less costly than larger interactive displays
  • Can utilize existing surfaces, like whiteboards

Cons:

  • Image quality can be affected by lighting conditions
  • May have limited functionality compared to dedicated displays

Document Cameras

Document cameras function like high-tech overhead projectors. They use a camera to capture live images of documents, books, 3D objects, or even experiments, and then project those images onto a screen or interactive display. This allows teachers to show detailed visuals, zoom in on specific areas, and even annotate or manipulate the projected image in real-time.

Pros:

  • Affordable and easy to integrate into the classroom
  • Versatile for displaying various materials
  • Simple to operate with minimal training

Cons:

  • Limited interactivity and multimedia integration
  • Primarily a supplementary tool, not a central learning platform

Interactive Whiteboards (IWBs)

IWBs bring interactivity to the traditional whiteboard, allowing for multitouch input and integration with educational software.

Pros:

  • Multitouch capabilities for collaborative activities
  • Access to a wide range of educational apps and resources
  • Well-established technology with ample support

Cons:

  • Higher cost compared to some alternatives
  • May require specialized training for teachers
  • Can have limitations in flexibility and integration

Interactive Displays 

Interactive displays are all-in-one touchscreen systems that combine high-resolution displays, computing power, and interactive features within a single unit. They go beyond traditional whiteboards and projectors by offering a centralized hub for teaching and learning activities.

Pros:

  • Combines the functionality of multiple tools (whiteboard, projector, and computer)
  • Offers high-quality visuals and responsive touchscreens for engaging lessons
  • Facilitates collaborative learning and supports diverse learning styles
  • Can be a central hub for classroom activities and integrates with other technologies

Cons:

  • Higher upfront investment compared to some alternatives
  • May require some teacher training to utilize all features effectively

Key Considerations for Selecting the Right Classroom Technology

When deciding whether to invest in interactive displays versus other technologies, schools should evaluate several key factors to ensure they make an informed decision that aligns with their educational goals and operational needs.

Budget and Total Cost of Ownership

  • Initial Costs: Compare the upfront costs of interactive displays with traditional whiteboards and other alternatives. While interactive displays may have a higher initial investment, their long-term benefits and durability can offer better value.
  • Ongoing Expenses: Consider maintenance, support, and potential upgrade costs. Interactive displays often come with comprehensive support and warranty options that can reduce long-term expenses.

Integration and Compatibility

  • Seamless Integration: Ensure that the chosen technology can integrate smoothly with your school’s existing infrastructure, including Wi-Fi networks, learning management systems (LMS), and other educational tools.
  • Compatibility: Check for compatibility with various devices and software to maximize the utility of the technology across different platforms and applications.
  • Interoperability: Ensure the displays can exchange data and work in concert with devices and software from other vendors, so they operate as part of a unified ecosystem across platforms and applications.

Assessing Educational Needs

  • Teaching Methods: Consider how the technology will support current and future teaching methods. For example, interactive displays and tablets can enhance multimedia-based lessons, while traditional whiteboards may better support straightforward, lecture-style teaching.
  • Enhancing Participation: Evaluate the potential of each technology to increase student engagement and participation. Interactive displays and tablets often provide interactive and multimedia-rich content that can make learning more engaging compared to traditional whiteboards.
  • Cater to Diverse Learning Styles: Ensure that the technology accommodates different learning styles—visual, auditory, kinesthetic—to provide an inclusive learning environment for all students.

User Training and Support

  • Ease of Use: Assess how user-friendly the technology is for both teachers and students. Intuitive interfaces can reduce the learning curve and encourage widespread adoption.
  • Training Programs: Ensure that adequate training and professional development resources are available to help educators effectively utilize the new technology.

Futureproofing and Scalability

  • Adaptability: Choose technologies that are adaptable to future technological advancements, including artificial intelligence (AI) integration and augmented reality (AR) capabilities. This ensures that your investment remains relevant as educational technology evolves.
  • Upgradability: Select solutions that can be easily updated or expanded to incorporate new features and functionalities without requiring a complete overhaul.
  • Scalability: Ensure that the solution can scale with the school’s growth, accommodating additional users and expanding functionalities as needed.

Security and Compliance

  • Compliance with Regulations: Verify that the chosen technology complies with relevant data protection regulations—such as FERPA (Family Educational Rights and Privacy Act) and COPPA (Children’s Online Privacy Protection Act)—to safeguard sensitive student and staff information.
  • Robust Security Features: Ensure that the technology offers robust security measures—including encryption, secure access controls, and regular security updates—to protect against data breaches and cyberthreats.

Samsung’s WAF Interactive Displays: A Strategic Investment for Modern Education

Samsung’s WAF Interactive Displays offer a compelling blend of features and benefits that make them a strategic investment for schools. Here’s how these displays provide value for the entire school community:

Deliver Interactive and Engaging Lessons

  • Dynamic Learning: Bring lessons to life with vibrant visuals, multimedia integration, and AI-powered personalized learning that adapts content to individual student needs.
  • Interactive Assessments and Content Sharing: Utilize interactive polls, quizzes, and educational games to gauge understanding and provide immediate feedback. Seamless screen mirroring allows content sharing on up to nine devices, ensuring every student is involved.
  • Collaborative Learning: Encourage group work with split note mode, allowing multiple students to write independently in designated zones, fostering both shared and personalized learning.

Streamline Classroom Instruction

  • Easy Multitasking and Effortless Annotation: Teachers can switch between applications using split-screen and annotate in real-time with the annotation button, maintaining lesson flow.
  • Seamless Collaboration and Dual Pen Flexibility: Facilitate group activities with multipoint touch drawing, supporting up to 40 simultaneous users. And switch between pen colors effortlessly with the dual pen feature.

Support Diverse Learning Needs

  • Personalized Learning Paths: Tailor lessons with on-the-fly content adjustments and AI-driven recommendations, ensuring each student receives targeted support.
  • Accessibility Features: Built-in screen readers, adjustable settings, and AI-powered voice recognition promote inclusivity and accessibility for all students.
  • Inclusive Learning: HDMI out allows teachers to share the display on larger screens, ensuring clear visibility and full participation for every student.

Enhance Administrative and Educational Processes

  • Efficient Lesson Planning and Delivery: Access digital resources and lesson plans through the Samsung Smart Classroom platform, streamlining lesson preparation.
  • Advanced Messaging and Simplified IT Management: Send urgent communications instantly via interactive whiteboards and manage displays remotely with a centralized device management solution, ensuring smooth operation and minimal downtime.

Ensure Cost-effectiveness and Scalability

  • Long-term Investment and Durability: Samsung’s WAF displays are built to last with robust construction, minimizing downtime and reducing the need for frequent replacements.
  • Scalability to Meet Growing Needs: Expand your technology infrastructure effortlessly with WAF’s modular design and cloud-based management solutions, allowing for seamless integration across multiple classrooms and schools.

Boost Connectivity and Flexibility

  • Convenient Connectivity: The 3-in-1 USB-C port supports screen mirroring, touch control, and device charging, facilitating hassle-free interactive lessons.
  • All-in-one Functionality: The OPS slot transforms the display into a Windows device, eliminating the need for an extra PC and providing access to familiar computer features directly on the display.

Robust Security and Reliable Support

  • Data Privacy and Security: Samsung’s WAF Interactive Displays incorporate robust cybersecurity measures, including encryption and secure access controls, ensuring compliance with regulations like FERPA.
  • Comprehensive Technical Support: Benefit from responsive support teams, training programs, and extensive documentation. Proactive maintenance and regular software updates ensure seamless operation and minimal disruptions.

Invest in the Future of Education with Samsung’s AI-Powered WAF Interactive Displays

Interactive displays are a strategic investment that enhances engagement, supports diverse learning needs, and streamlines administrative tasks. Samsung’s AI-powered WAF Interactive Displays offer a comprehensive solution that surpasses other classroom technologies by integrating multiple functionalities into one intelligent platform. Their advanced features improve engagement, simplify administrative tasks, and provide scalable, secure solutions that grow with your district’s needs.

Ready to elevate your classrooms? Reach out to learn more about how Samsung’s WAF Interactive Displays can meet your district’s unique needs and drive educational excellence.

]]>
TechSperience Episode 134: AIOps – The... https://community.connection.com/techsperience-episode-134-aiops-the-future-of-threat-detection-and-response/ Feb 18, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/02/2926721-TechSperience-Ep134-AIOps-BLOG.jpg

Artificial intelligence for IT security operations (AISecOps) is revolutionizing cybersecurity. In this episode, we discuss how AI and machine learning are used to analyze IT data, allowing organizations to:

  • Proactively identify and respond to threats: Detect anomalies, predict outages, and automate incident response.
  • Improve efficiency: Streamline security operations and optimize resource allocation.
  • Enhance threat detection: Uncover complex attack patterns and stay ahead of emerging threats.

We'll cover real-world applications of AISecOps, the challenges in implementing this technology, and future trends in AI-driven security. Finally, we'll provide actionable insights for organizations looking to strengthen their security posture with AISecOps.

Speakers:
John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection
Kimberlee Coombes, Security Solution Architect, Connection

Show Notes: 

00:00 Introduction to AIOps and Cybersecurity
03:07 The Role of AIOps in Threat Detection
06:01 Adapting to Evolving Cyber Threats
08:58 AI in Proactive Threat Hunting
11:55 Challenges in AIOps: False Positives and Data Quality
14:51 The Future of AIOps and Self-Healing Systems
17:45 AI and Zero Trust Strategies
21:07 AIOps for Small and Medium Businesses
23:52 Final Thoughts and Recommendations

]]>
Top Trends in Manufacturing Factory... https://community.connection.com/top-trends-in-manufacturing-factory-modernization-for-2025/ Feb 13, 2025 Ryan Spurr https://community.connection.com/author/ryan-spurr/ https://community.connection.com/wp-content/uploads/2025/02/2924748-Manufacturing-Trends-BLOG.jpg

The manufacturing industry is undergoing a practical transformation—driven by the necessity for improved efficiency, enhanced security, and a future-ready infrastructure. While there is a lot of excitement surrounding artificial intelligence, the core investments Connection sees manufacturing customers making are in the modernization of their factories and critical business processes as a stepping stone for what comes next.

According to Connection’s Biannual Manufacturing Survey, 98% of manufacturing leaders believe modernization is crucial1 for differentiation and growth. Without these investments, businesses risk falling behind and becoming unable to execute their strategies effectively.

Networking: The Digital Backbone

Networking forms the fundamental layer of factory modernization. The integration of factory devices, industrial control systems, robotics, and sensors requires a robust and scalable networking infrastructure with cybersecurity layered throughout. This connectivity enables real-time data collection and analysis, pivotal for optimizing production processes. According to Cisco, connected factories see a 21% increase in operational efficiency—so it’s no surprise that 46% plan to use OT data to improve quality, while 43% will optimize their processes.

High-speed, low-latency networks are essential to support the vast amount of data generated by modern manufacturing equipment and systems. It’s also about networking designed to support industrial environments and their unique challenges. This includes networking designed with zero trust, micro-segmentation, real-time monitoring of traffic, industrial deep packet inspection, asset and route visibility, and the ability to detect and alert on anomalous activity in the environment. And of course, these networks also need to support the diverse, heterogeneous manufacturing technologies that drive plant facilities, production equipment, and a broad range of functional stakeholders, keeping the value engine of manufacturing running.

Cybersecurity and Industry Security Compliance

With increased connectivity comes an escalated need for robust cybersecurity measures. The manufacturing sector has become a prime target for cyberattacks, making it the most attacked industry. In addition to the threats themselves, the impacts on businesses have also increased, turning cybersecurity from an “it may happen” risk-avoidance scenario into a hard financial impact that requires attention.

In our recent customer survey, 59% of manufacturers experienced higher cybersecurity insurance premiums, 27% were struggling to comply with existing or new insurance requirements, and 14% were dropped altogether.2 On the regulation front, new financial regulatory mandates require publicly traded companies to report incidents to the market, bringing greater investor scrutiny and public visibility to the threats most manufacturers face. We are also seeing investment into new cybersecurity capabilities related to standards such as the NIST, ISO, and CMMC cybersecurity frameworks. These standards ensure that data protection and privacy measures are consistently applied, reducing the risk of breaches, assisting with cybersecurity insurance, and addressing a key focus of investment stakeholders.

As a result of the increasing impact and focus on cybersecurity, manufacturers are rethinking traditional operational technology norms and the use of cybersecurity within these domains. This is driving manufacturers to adopt comprehensive cybersecurity strategies that include asset visibility, flow monitoring, industrial deep packet inspections, firewalls, regular security audits, and new sensor technology that monitors both IT and industrial packets flowing at lower levels within the Purdue model.

Composable Architectures

Composable architecture refers to a modular approach to system design, where components can be easily configured and reconfigured to meet changing demands. This flexibility is vital for manufacturing plants that need to adapt quickly to new products or changes in market conditions. According to Gartner, by 2023, 60% of new digital business solutions will rely on composable architecture.3 This trend allows manufacturers to innovate rapidly without the constraints of rigid, monolithic systems.

Third-party Remote Access

The ability to provide remote access to employees, third-party vendors, and service providers is becoming increasingly important. This capability enables experts to troubleshoot issues, perform maintenance, and update software without being on-site. It’s also not a new concept: legacy approaches to third-party access often expose not just targeted devices but entire networks, increasing the potential security risks in an already at-risk environment. 51% of organizations have experienced data breaches caused by third-party remote access, and 66% have not implemented least-privileged access.4

To mitigate these risks, manufacturers should implement modern secure remote access solutions. This includes zero-trust network access across the various layers of the Purdue model to provide methodical integration with low-level industrial control networks and devices. It should also include an inventory of all third parties, auditing of all transactions with fine details, minimizing how files are exchanged, and a means to speed compliance with auditing regulations.
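
The least-privilege and auditing controls described above can be sketched in a few lines. This is a hypothetical illustration only: the policy entries, device names, and actions are invented for the example and don’t reflect any specific vendor product or protocol.

```python
from datetime import datetime, timezone

# Hypothetical allowlist: each third party is granted access only to the
# specific devices and actions it needs (least privilege), never the network.
ACCESS_POLICY = {
    "hvac-vendor": {"devices": {"plc-hvac-01"}, "actions": {"read", "update_firmware"}},
    "robot-integrator": {"devices": {"robot-cell-3"}, "actions": {"read"}},
}

audit_log = []  # every request is recorded in fine detail, allowed or not


def request_access(party: str, device: str, action: str) -> bool:
    """Grant access only if the party's policy covers this device and action."""
    policy = ACCESS_POLICY.get(party)
    allowed = bool(policy and device in policy["devices"] and action in policy["actions"])
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "party": party,
        "device": device,
        "action": action,
        "allowed": allowed,
    })
    return allowed
```

Because every request lands in the audit log whether it is granted or denied, the same structure that enforces least privilege also produces the transaction trail that speeds compliance audits.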

Readying the Factory for Future Technologies

A forward-thinking approach involves preparing the factory for future technological advancements. This readiness includes integrating data from various sources, automating processes, and incorporating AI capabilities.

Data Integration

Data integration is the process of combining data from different sources to provide a unified view. This integration is critical for enabling advanced analytics and informed decision-making. According to Accenture, manufacturers that leverage integrated data can improve business growth by 30 percent per year.5 Data lakes and cloud storage solutions are often employed to facilitate this integration, allowing for scalable and flexible data management, but many are also deploying new strategies around data orchestration from low-level plant operations to business systems of record to cloud platforms. Tapping into these data-rich environments will fuel manufacturers’ ability to gain a competitive advantage.
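
At its simplest, the “unified view” idea is just keying records from each source on a shared identifier and merging their fields. The sketch below assumes two invented sources (an MES export and an OT sensor feed) with illustrative field names; real schemas and pipelines would be far richer.

```python
# Hypothetical records from two plant data sources. Field names are
# illustrative, not a real MES or sensor schema.
mes_rows = [
    {"asset_id": "press-7", "line": "A", "planned_output": 1200},
    {"asset_id": "cnc-2", "line": "B", "planned_output": 800},
]
sensor_rows = [
    {"asset_id": "press-7", "avg_temp_c": 61.4, "vibration_mm_s": 2.1},
    {"asset_id": "cnc-2", "avg_temp_c": 48.9, "vibration_mm_s": 1.3},
]


def unify(*sources):
    """Combine rows from every source into one record per asset_id."""
    merged = {}
    for source in sources:
        for row in source:
            merged.setdefault(row["asset_id"], {}).update(row)
    return merged


unified = unify(mes_rows, sensor_rows)
# unified["press-7"] now holds both the MES and sensor fields in one view.
```

Analytics and AI workloads can then query one record per asset instead of stitching systems together at report time, which is the payoff data lakes and orchestration layers deliver at scale.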

Automation

Automation is no longer a futuristic concept but a current necessity. By automating repetitive tasks, manufacturers can reduce human error, increase production speed, and free up employees for more complex tasks. For example, industrial companies are expected to invest 25% of their capital in industrial automation over the next 5 years.6 Automation technologies include robotic process automation (RPA), autonomous mobile robots (AMRs), and advanced manufacturing execution systems (MES). Our teams are also seeing other technologies—like agentic AI—transform how actions are taken on behalf of employees, customers, or other AI systems based on AI insights or triggers.

Artificial Intelligence

AI is transforming the manufacturing landscape by enabling predictive maintenance, quality control, and supply chain optimization. Predictive maintenance uses AI algorithms to predict when equipment is likely to fail, allowing for timely interventions that prevent costly downtime. In quality control, AI systems can detect defects more accurately and quickly than human inspectors. For supply chain optimization, AI can analyze vast amounts of data to forecast demand, optimize inventory levels, and improve logistics efficiency.

Two popular use cases that align with the growing demand for GenAI and LLMs are small language models (SLMs)—trained on data-rich manufacturing environments and optimized with impactful prompt engineering to make it easier to identify risks and opportunities and aim organizations toward continuous, meaningful improvement—and (eventually) agentic AI, where action is taken based upon human and AI engagement.

Conclusion

The modernization of manufacturing factories is not just a trend but a critical necessity for staying competitive in today's market. Manufacturers can achieve significant gains in efficiency, security, and flexibility by investing in networking, cybersecurity, composable architectures, and future-ready technologies. These investments are essential for differentiation and growth, enabling manufacturers to execute their business strategies successfully and remain resilient in the face of evolving challenges.

Engage our Manufacturing Practice today to learn how to get started!

1 https://www.connection.com/media/magpoy4u/cnxn-manufacturing-ot-cybersecurity-market-pulse-survey.pdf

2 https://www.connection.com/media/magpoy4u/cnxn-manufacturing-ot-cybersecurity-market-pulse-survey.pdf

3 https://www.gartner.com/en/doc/465932-future-of-applications-delivering-the-composable-enterprise

4 https://security.imprivata.com/rs/413-FZZ-310/images/IM_Report_Third-Party-Remote-Access-Security.pdf

5 https://www.accenture.com/nl-en/blogs/insights/data-driven-enterprise

6 https://explodingtopics.com/blog/robotics-industry-stats


]]>
Future Ready, Quality Outcomes with... https://community.connection.com/future-ready-quality-outcomes-with-connection-at-himss-2025/ Feb 11, 2025 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2025/02/2932971-HIMSS25-BLOG.jpg

This is the time of year when many organizations host their company kickoffs, sales kickoffs, and partner and industry events. Among them, HIMSS 2025 returns to Las Vegas on March 3–6, hosted at the Venetian Convention and Expo Center, Caesars Forum, and Wynn Las Vegas.

With more than 30,000 people expected from the healthcare payor, provider, and partner ecosystem, this is an event that can easily overwhelm a first-time attendee. Partner with Connection during our 19th year as a Diamond Sponsor, and we can help make this week exceptionally valuable to you and your organization, whether this is your first year attending or one of many.

Partnership: Existing and Emerging

HIMSS publishes a list of exhibiting partners on its public-facing site and keeps it current in the run-up to the conference. Work with your Connection team to curate your partner meeting lineup and make your time at HIMSS 2025 more effective. This is a great opportunity to meet with your strategic application providers to align on your goals in the year ahead. Many partners will make important announcements and present new collaborations during HIMSS, and it’s great to learn this information in real time.

Continuing Education

Connection has several team members who hold their CAHIMS or CPHIMS certifications. Other professional and clinical attendees may earn CE credits for senior healthcare executives (ACHE), physicians (CME), nurses (CNE), pharmacists (ACPE), clinical dietitians (CDE), psychologists (APA), social workers (ASWB), project managers (PMP and others), health information management professionals (AHIMA), information security professionals (IAPP), cybersecurity professionals (GIAC), certified imaging informatics professionals (CIIP), and those seeking contact hours towards the renewal of CAHIMS®, CPHIMS®, or CPDHTS®. It can be difficult for individuals to keep current with their continuing education and HIMSS does a great job offering accredited content throughout the event.

Public Policy

With a new Congress, newly inaugurated President Trump, and a new cabinet, HIMSS 2025 offers a public policy track that will provide up-to-the-minute information on changes to regulations, governance, interagency news, and newly formed government departments under the new administration. Expect cybersecurity and artificial intelligence to remain at the forefront of these sessions, as well as speakers from current and former Trump administration leadership, including Seema Verma, who served as administrator of the Centers for Medicare and Medicaid Services (CMS), the largest payer in the world, from 2017 to 2021. Payor and provider organizations looking for a deep understanding of healthcare policy reform should invest some time in these sessions.

Networking

HIMSS 2025 is a reunion for those of us who have been attending over the last decade or more, an opportunity to share our clients’ successes and reaffirm the partnerships that are our guideposts in delivering exceptional patient care. We look forward to hosting you in Booth #3907 to discuss all things healthcare IT.

Abacode | Microsoft

MCCS Powered by Abacode + Microsoft Security offers a tailored cybersecurity and compliance solution for healthcare. This partnership enhances security, ensures compliance with critical regulations like HIPAA and HITRUST, and simplifies operations. With 24x7 monitoring and proactive support, MCCS Powered by Abacode + Microsoft Security enables healthcare providers to protect patient data and focus on delivering exceptional care.

Apple

Groundbreaking Ways to Deliver Healthcare. When healthcare providers have powerful, intuitive tools for delivering care, they can work more effectively in hospitals, connect remotely with patients, and conduct groundbreaking medical research. Apple devices and healthcare apps are bringing more efficiency—and more connection—to the ways that medicine is practiced, care is delivered, and patients are engaging with their health.

Dell Technologies

Dell Technologies understands that the role of information security professionals and technology is that of many dimensions. Information security is no longer separate from IT operations or daily organizational operations. An organization’s information security posture has become a symphony; it takes every tool and every skill to maintain harmony. Dell Technologies’ focus is the intersection of people, processes, and technology—and the best application of all via mission-defined objectives. Security challenges will impact all organizations. Dell Technologies is a leader in protecting your data, protecting the integrity of your business operations, and being your trusted partner—from the endpoint to the vault and back.

Google Chrome | Intel

Chromebook Plus devices with Intel® Core™ processors deliver 2x the power, speed, memory, and storage of other devices1, empowering healthcare staff to do their best work. All devices run on ChromeOS, the most secure OS out of the box2, which protects employees from security breaches, and organizations’ bottom-line profitability from costly attacks. ChromeOS also integrates seamlessly with leading EHR vendors, virtualization partners, and the preferred peripherals of leading healthcare organizations.

Let’s Shape the Future of Healthcare Together at HIMSS 2025!

As we gather at HIMSS 2025, let’s embrace the opportunity to shape the future of healthcare together. Visit Connection in Booth #3907 to discover how our healthcare solutions and services optimize workflows, secure sensitive data, and elevate the patient and provider experience.

Schedule a meeting with the Connection team in Booth #3907 during HIMSS 2025.

1. Atredis Partners Google ChromeOS Competitive Analysis, 2024

2. IDC InfoBrief, sponsored by Google, Deploying ChromeOS for Better Security Outcomes, doc

]]>
OT Cybersecurity Can No Longer Be Ignored https://community.connection.com/ot-cybersecurity-can-no-longer-be-ignored/ Jan 23, 2025 James Rust https://community.connection.com/author/james-rust/ https://community.connection.com/wp-content/uploads/2025/01/2908216-OT-Cybersecurity-Blog-BLOG.jpg

The threat of cyberattacks is escalating at a terrifying rate. A staggering 84% of businesses surveyed experienced at least one successful cybersecurity incident in 2023, a significant jump from 60% in 2022.1 90% of those surveyed ranked operational technology (OT) cybersecurity risk to be moderate to severe, and 79% of IT professionals believe an attack is likely to come in the next year.1

An increasing number of devices in manufacturing are directly on networks thanks to the widespread adoption of industrial IoT. When visiting manufacturing facilities, we see that departments other than IT have purchased devices that end up on the network and aren’t properly protected. Unfortunately, this can mean more avenues of attack for hackers who want to steal your data and hamper your ability to operate in exchange for a hefty ransom.

So what’s to be done? You might assume that a solid insurance plan will protect you, but as attacks increase, insurance providers have become more stringent with their requirements: 84% of organizations reported difficulty finding or qualifying for coverage, and 43% attributed it to their cybersecurity setup.1 A more proactive and robust approach to cybersecurity is now a necessity.

Knowing Is Half the Battle!

I’ve encountered a surprising number of factories that have devices on the network that IT is completely unaware of. In one instance, we even discovered an unknown cellular gateway operating without any oversight or security measures. The first step in ensuring there are no holes in your security is gaining a complete understanding of what’s on your network.

Addressing this challenge may seem daunting, but it doesn’t have to be. Connection’s Manufacturing OT Cybersecurity Assessment provides a thorough evaluation of your current network environment. We typically begin with a remote review of your OT/IT environment to determine what work needs to be done and how it should be scoped. We’ll also look at your cybersecurity needs, objectives, and any other concerns you might have.

After the remote review, we’ll come onsite and deploy specialized devices to your network to scan for all connected devices and their communication patterns. This is typically done passively, so your operation will not be affected in any way. By running the software, we’ll gain a complete inventory of your network assets, including their location, communication flows, and potential vulnerabilities. While we are onsite, we will physically tour your facility to gain a better understanding of the operational context of your devices.

Actionable Insights Will Be Yours

Upon completion of the scan, we will deliver a comprehensive report detailing our findings. This report will include a complete inventory of all network devices and components, classified by location, type, and other relevant parameters. A detailed network map will visualize all devices and their communication pathways. Each device will be assigned a risk score, along with clear justifications for the assigned rating. Based on the risk assessments, we will provide specific recommendations to enhance your network security. This may include device reconfiguration, software updates, hardware upgrades, and architectural modifications to your network. This report will serve as a roadmap for implementing a robust security posture, enabling you to proactively address vulnerabilities and fortify your critical infrastructure.

You Deserve the Best

The increasing complexity and connectivity of modern manufacturing environments has led to numerous vulnerabilities in OT cybersecurity. The bad news is that bad actors have caught on to this fact and have begun targeting operational technology, but the good news is that there are now ways you can address it so that you are protected. If you’re concerned about your cybersecurity, engage our Manufacturing Practice at Connection today. We’ll be happy to come visit your site and conduct a free inspection of your network so that you can know you have the best protection available.

1. Connection, Manufacturing OT Cybersecurity Market Pulse Survey: https://www.connection.com/media/magpoy4u/cnxn-manufacturing-ot-cybersecurity-market-pulse-survey.pdf

]]>
TechSperience Episode 133: From Chaos to... https://community.connection.com/episode-133-from-chaos-to-clarity-ai-security-tools-at-work/ Jan 07, 2025 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2025/01/2900990-Techsperience-Episode-133-Blog-BLOG-1.jpg

In this episode, our Security Center of Excellence team delves into a real-world cybersecurity mystery and its unexpected solution. This incident is a perfect storm, highlighting the complexities of modern IT environments and how seemingly unrelated actions can cascade into significant business disruptions and major security incidents. But here's where it gets fascinating: what appeared to be a sophisticated cyber-attack turned out to be something far more mundane, yet equally dangerous.

The hero of this story is an AI-powered assistant, still in its proof-of-concept phase, that cracked the case. Join us as we unravel the mystery of the Phantom Brute Force attack and the AI detective that solved it. This isn’t just a mere cautionary tale about the complexities of modern IT environments; it serves as a stark reminder of how easily security can be compromised in unexpected ways.

Speakers: John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection
Kimberlee Coombes, Senior Security Architect, Connection

Show Notes:
00:00 Introduction to Cybersecurity Challenges
01:02 The Incident Unfolds: A Case Study
03:51 Utilizing AI Tools for Incident Resolution
10:29 The Role of AI in Cybersecurity
15:35 Bridging Theory and Practice in Cybersecurity
18:00 Best Practices: Hard-Coded Passwords and Change Management
23:43 Empowering Users with AI Tools

]]>
Embracing Technology Trends in 2025:... https://community.connection.com/embracing-technology-trends-in-2025-navigating-talent-challenges-and-diversifying-hiring-strategies/ Jan 06, 2025 Patrick Dja Konan https://community.connection.com/author/patrick-dja-konan/ https://community.connection.com/wp-content/uploads/2024/12/2888937-IT-Trends-Blog-Post-Name-BLOG-1.jpg

As we move into 2025, companies are increasingly focused on integrating cutting-edge technology trends to stay competitive and drive innovation. However, the rapid pace of technological advancement brings significant challenges—particularly in finding IT talent with the specialized skillsets needed to implement these trends effectively. 

Several technology trends are set to shape the business landscape in 2025. According to Gartner, notable trends include agentic AI, post-quantum cryptography, hybrid computing, and spatial computing. Fast Company highlights the importance of sustainability, platform selectivity, and overall digital product ecosystem. These technologies promise to enhance efficiency, drive innovation, and create new business opportunities. 

Despite the exciting potential of these technologies, companies still face significant hurdles in sourcing the right talent. The demand for skilled professionals in areas such as AI, machine learning, and cybersecurity far exceeds the supply. A report by Robert Half indicates that 90% of hiring managers in tech and IT face challenges finding skilled candidates. This talent shortage is worsened by the rapid pace of technological change and the evolving skill requirements. 

To address these challenges, companies are increasingly diversifying their hiring strategies by combining full-time employees with IT consultants. This approach offers several benefits: 

  1. Flexibility and Scalability: IT consultants provide the flexibility to scale teams up or down based on project needs. This is particularly valuable for short-term projects—or when specialized expertise is required temporarily. 
  2. Access to Specialized Skills: Consultants often bring niche expertise that may not be available in-house. This can accelerate the implementation of complex technologies and ensure projects are completed efficiently. 
  3. Cost-effectiveness: Hiring consultants can be more cost-effective than maintaining a large full-time staff, especially for specialized roles that are not needed on a permanent basis. 
  4. Enhanced Innovation and Creativity: A diverse workforce, including both full-time employees and consultants from various backgrounds, fosters creativity and innovation. Diverse teams bring different perspectives, leading to better problem solving and decision making. 
  5. Improved Employee Retention: Offering a mix of full-time and consulting opportunities can improve job satisfaction and retention. Employees appreciate the flexibility and opportunities for professional growth that come with working alongside consultants. 

Shaping the Future of IT Hiring

As companies navigate the complexities of implementing new technology trends in 2025, diversifying hiring strategies to include both IT consultants and full-time employees will be crucial. This approach not only addresses the talent shortage but also enhances innovation, flexibility, and overall business performance. By leveraging the strengths of a diverse workforce, organizations can better position themselves for success in the rapidly evolving tech landscape. As a leading IT partner, Connection offers contract, contract-to-hire, full-time, and payrolling options to help organizations navigate talent challenges and diversify their hiring strategies.

]]>
2025 Healthcare IT Trends: What’s Next for... https://community.connection.com/2025-healthcare-it-trends-whats-next-for-patient-centered-care/ Dec 19, 2024 Jennifer Johnson https://community.connection.com/author/jennifer-johnson/ https://community.connection.com/wp-content/uploads/2024/12/2883549-2025-Healthcare-IT-Trends-Blog.jpg

As we close 2024 and look toward 2025, I begin this post filled with overwhelming gratitude to our clients and partners, all of whom in very specific ways are part of patient-centered care. Together we build solutions that innovate, retain healthcare employees, grow IT talent, and contain costs for our healthcare providers across the continuum of care. 

As we look ahead to 2025, the landscape of healthcare and technology will continue to evolve rapidly, shaped by shifting political dynamics, groundbreaking innovations, and the ongoing challenges of delivering patient-centered care.

A New(ish) President, A New Congress

When President-elect Donald Trump is sworn in for his second term on January 20, 2025, it’s widely believed that one of his first of more than 200 executive orders will be to rescind President Biden’s October 30, 2023 Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence, eventually replacing it with one of his own. Now that the healthcare industry has had several years of Artificial Intelligence or AI adjacencies in both patient care and administrative functions, the debate around AI and its use will become more nuanced. We’ve spent the last few years in a caveman-like dialogue “AI good. No. AI bad.” 2025 will pressure test what we’ve learned and reveal blind spots as AI becomes more embedded, less opt-in, and included in how we deliver patient care, starting with the Electronic Health Record. While Judy Faulkner, Michelle O’Conner, and David Feinberg are known to those of us working in Health IT, expect That One Uncle of Yours to name drop them in the same sentence with Jensen Huang, Elon Musk, Tim Cook, and Sundar Pichai.

Virtual Desktops

I’ve written about this before, but I predict that 2025 is the year you’ll see widespread use of virtual desktops in healthcare, particularly at the bedside. Though the primary drivers will be rising software licensing costs, ease of management, and improved security, the DOJ antitrust lawsuit against Alphabet compelling the sale of Google Chrome is an interesting twist. This is going to “hit different” for the ironic-t-shirt-wearing Millennials and Gen Zers, cloud-first generations that grew up using Chromebooks in school. Now that these generations are healthcare IT decision makers themselves, the DOJ remedy may remove some of the endpoint inertia we’ve seen since 2020, as the threat of “losing” Chrome makes it more valuable to that persona group. That said, the DOJ remedy might not stick under the new administration, and Google will likely file its own remedy. Looking beyond ChromeOS, both IGEL and Azure Virtual Desktop have strong healthcare case studies, and the timing seems right for greater adoption of these solutions in healthcare.

Virtual Care

It’s been five years since the COVID-19 pandemic transformed healthcare. One of the more lasting technology solutions from that time is telehealth and the positive impact that virtual care has had for patients, families, and care teams. It has allowed care teams greater reach, driven adoption of smart hospital rooms, and led to improved HCAHPS scores. Though the federal policies that provide reimbursements are set to expire on December 31, 2024, there is strong bipartisan support for extending these benefits. I expect to see an increase in the number of telehealth projects and some creative ways that healthcare providers seamlessly blend virtual care with AI virtual assistants.

SaaS Applications/Cloud Usage

During the November CHIME Fall Forum in San Diego, a healthcare CIO confided to me that they were using more than 300 SaaS-based applications. Seeing the expression on my face, they offered that while the cloud is still great for things like low upfront cost, speed to deploy, and ability to scale, these benefits were more keenly realized when cloud adoption was new and there were fewer than 25 agreements in place. Now more than 15 years into our collective cloud journey, OPEX concerns, security vulnerabilities, and egress fees have left healthcare organizations with a sprawling, often disjointed, poorly rationalized cloud strategy. This is true for many large organizations, but healthcare, stymied by stagnant reimbursements and overrun by inflation on med-surg equipment, is uniquely impacted. Expect some new healthcare stakeholders to start asking, “We’re paying SaaS Partner A how much a quarter?” This could lead to different negotiation strategies with the partner community, overall solution consolidation, or selective use of on-prem/hybrid cloud models. No COO should be forced to choose between paying their SaaS provider and buying a life-saving medical device.

As we step into 2025, the opportunities and challenges facing healthcare IT are both exciting and complex. From navigating policy shifts and advancing AI adoption to reimagining virtual care and cloud strategies, the decisions we make today will shape the future of patient-centered care. Together with our clients and partners, we remain committed to building innovative, sustainable solutions that empower healthcare providers across the continuum of care. Here's to a year of progress, collaboration, and impact. Connection’s healthcare IT experts are here to help you find, customize, and implement the right technology to achieve your organization’s goals. Learn more about our Healthcare Solutions and Services or contact your Connection Account Team today.

]]>
Prepare for Upcoming Microsoft CSP Licensing... https://community.connection.com/prepare-for-upcoming-microsoft-csp-licensing-and-pricing-changes-with-connection/ Dec 17, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/12/2888424-Prepare-for-Upcoing-Microsoft-CSP-Licensing-BLOG.jpg

Microsoft has announced changes to its Cloud Solution Provider (CSP) program that will affect licensing and pricing structures for online services. These updates, taking effect in late 2024 and early 2025, require organizations to adjust their purchasing strategies and budgets.

As a trusted Microsoft partner, Connection is here to guide you through these changes and help you optimize your licensing for long-term success.

1. New Monthly Billing Option for Microsoft 365 Copilot

Effective December 1, 2024, Microsoft began offering a new annual subscription term with a monthly billing option for Microsoft 365 Copilot in CSP. This option introduces more flexible payment terms compared to the existing annual term with annual billing. Note: the new option carries a 5% premium over the existing annual term/annual billing pricing.

2. Price Increase for All Annual Term/Monthly Billing CSP SKUs

Effective April 1, 2025, Microsoft will increase pricing by 5% for all annual term/monthly billing online services SKUs; this applies to both renewals and new subscriptions starting on or after that date. Previously, organizations paid the same rate for annual and monthly payment options. This change adds a cost differential.

How Will These Changes Impact Your CSP Environment?

Organizations relying on monthly billing cycles may face notable budget increases. For example, a company using Microsoft 365 E3 for 500 users on a monthly billing cycle could see costs rise by an estimated $10,000 annually. These changes emphasize the need for strategic planning and budgeting to minimize disruptions and optimize expenditure.
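The arithmetic behind that estimate is easy to reproduce. The sketch below assumes a hypothetical $36/user/month list price for Microsoft 365 E3 purely for illustration; substitute your actual contracted rates when budgeting.

```python
# Back-of-the-envelope estimate of the April 2025 annual-term/monthly-billing
# premium. The $36/user/month E3 price below is an assumed list price for
# illustration only; confirm current pricing with your CSP partner.

def annual_cost(users, price_per_user_month, premium=0.0):
    """Annual spend for a per-user subscription, with an optional premium."""
    return users * price_per_user_month * 12 * (1 + premium)

users = 500
e3_price = 36.00  # assumed list price, USD per user per month

before = annual_cost(users, e3_price)        # monthly billing today
after = annual_cost(users, e3_price, 0.05)   # with the 5% premium
increase = after - before

print(f"Estimated annual increase: ${increase:,.0f}")  # roughly $10,800
```

At these assumed rates the differential lands near the $10,000 figure cited above; the exact number scales linearly with seat count and per-user price.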

As a Microsoft Cloud Solutions Provider (CSP), Connection delivers unmatched support to organizations investing in cloud technologies and solutions from Microsoft. CSP with Connection isn’t just about licensing—it’s about partnership. We’re committed to helping you optimize your cloud journey by:

1. Comprehensive Licensing Reviews—Our Microsoft specialists will analyze your current licensing agreements to identify potential savings. We’ll assess your usage, detect redundancies, and recommend adjustments to right-size your licenses under the new pricing model.

2. Budget Planning and Forecasting—Connection’s consultants will work with you to develop a proactive budget strategy that incorporates these pricing changes. We aim to help you avoid surprises and maintain financial control while adapting to the new structure.

3. Ongoing Support and Guidance—Whether you choose to switch to annual billing or maintain monthly payments, our team will assist in navigating the transition. We provide continuous support to adapt your licensing to future changes with minimal disruption.

At Connection, we understand the challenges of evolving licensing models and pricing structures. Our expertise ensures a smooth transition and long-term success for your organization. Don’t wait to plan for these changes. Contact us today to schedule a consultation and explore how we can help optimize your Microsoft environment.

]]>
TechSperience Episode 132: Frontline... https://community.connection.com/techsperience-episode-132-frontline-cybersecurity-trends-impacts-and-zero-trust-insights/ Dec 10, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/12/2880733-TechSperience-Ep131-Frontline-CyberSecurity-Blog-BLOG.jpg

In the ever-evolving world of cybersecurity, staying ahead isn't just an option—it’s a necessity. In this episode, we peel back the layers of the latest attack vector trends and take you into the heart of a recent cyber incident that challenged conventional defenses. This isn’t just about technology; it’s about the human element—everyday people navigating an invisible battlefield. 

We’ll break down the anatomy of a sophisticated malware attack that slipped past traditional security measures with ghost-like precision. Discover the tools, strategies, and decisions that led to its eventual detection, containment, and remediation. 

Beyond the technical deep dive, we’ll explore the critical role of Zero-Trust principles in building resilient defenses and highlight how fostering a culture of awareness and vigilance can be the ultimate game-changer. Whether you’re an industry veteran or just starting your cybersecurity journey, this episode is packed with insights and actionable takeaways to fortify your defenses and stay ahead of emerging threats.

Speakers: John Chirillo, Principal Security Architect, Connection
Rob Di Girolamo, Senior Security Architect, Connection
Pam Kennedy, Senior Cybersecurity Engineer, Connection
Kevin Knapp, Senior Cybersecurity Engineer, Connection

Show Notes: 

00:00 Introduction to Cybersecurity Trends

02:50 Ransomware Evolution and Tactics

06:07 AI’s Role in Cyber Threats

09:01 Critical Infrastructure Vulnerabilities

11:52 Supply Chain and Vendor Attacks

15:11 Identity-based Attacks and Authentication Challenges

18:05 Key Takeaways for Organizations

20:57 Case Study: The Wave Browser Incident

27:12 Post-incident Analysis and Lessons Learned

Learn More in Frontline Cybersecurity: Trends, Impacts, and Zero-trust Insights - Companion Presentation.

]]>
TechSperience Episode 131: Securing Industry 4.0 https://community.connection.com/techsperience-episode-131-securing-industry-4-0/ Dec 04, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/12/2867969-TechSperience-SecuringIndustry-BLOG.jpg

The manufacturing sector is racing towards a digital future, but this innovation brings an explosion of cyber threats. This episode exposes the hidden vulnerabilities putting manufacturers at risk and reveals essential strategies to protect your operations. Discover how to secure vulnerable legacy systems, maximize your cybersecurity insurance, and gain critical visibility into Operational Technology (OT).

We'll also explore how Microsoft and AI are revolutionizing cybersecurity defense. Don't let cybercriminals cripple your progress. Tune in now for actionable insights and expert advice to fortify your manufacturing business against evolving threats.

Host:

James Hilliard

Guests:

Ryan Spurr, Manufacturing Strategy Director, Connection
Travis Guinn, Principal Partner Solutions Architect for Security, Microsoft

Show Notes:

00:00 Introduction to Industry 4.0 Security Challenges
03:03 Assessing Current Security Postures in Manufacturing
05:54 Cybersecurity Insurance and Its Impact on Manufacturers
09:12 The Need for Visibility and Insight in Manufacturing
12:02 Modernization and Integration of IT and OT
14:52 Decision-Making in Cybersecurity Modernization
17:49 Leveraging Microsoft Tools for Cybersecurity
21:02 The Role of AI in Enhancing Cybersecurity
25:46 Conclusion and Next Steps for Manufacturers

]]>
Cyber Threats in Education and What to Do... https://community.connection.com/cyber-threats-in-education-and-what-to-do-about-them/ Nov 21, 2024 Pam Aulakh https://community.connection.com/author/pam-aulakh/ https://community.connection.com/wp-content/uploads/2024/11/2856377-Cyberthreats-Education-BLOG.jpg

The top target for ransomware attacks isn’t the healthcare industry or critical infrastructure, as many may assume. It is education, particularly K–12 institutions. Multiple studies found a dramatic increase in the number of attacks against schools, jumping from 129 in 2022 to 265 in 2023.1,2

The education sector is a data treasure trove of personal information belonging to students, educators, parents, and alumni—ranging from Social Security numbers to credit card numbers. The PII of children is especially attractive to threat actors because no one is running credit checks or using that information until they want a driver’s license or try to rent an apartment after graduating. By then, unfortunately, their identity could have been compromised an untold number of times.

The Reasons Behind the Rise in Cyberattacks against Schools

The vast amount of valuable data is why schools have shot up to number one on threat actors’ target lists, but it isn’t the only reason why education is seeing an increase in cyberattacks.

There is a greater reliance on technology even for the youngest children. Students are given computers and tablets to use both in school and at home—adding thousands of devices to the network and thousands of users who are untrained in basic cybersecurity hygiene.

School districts also traditionally have understaffed IT and cybersecurity teams, often spread out across different schools. Many have an aging infrastructure, poorly equipped to handle the more sophisticated and faster technology used by students and faculty.

Today’s school children live online, and threat actors know it. Social engineering tactics lure kids to watch videos on YouTube or TikTok and phishing emails lure them into making mistakes that launch malware into the school network.

Should Schools Pay Ransomware Payments?

Nearly half of schools hit by ransomware have paid to recover their data.3 But should they?

No, say the FBI and CISA. Payment doesn’t guarantee data will be released, and threat actors are increasingly holding data for multiple ransoms even after they are paid. There is also concern of subsequent ransomware attacks if the threat actors know they’ll get paid.

However, because of the nature of the data that is stolen, there are many district administrators who think the chance of recovery is worth the payment.

Educational Institutions Respond

Thanks to a number of government resources available, schools have access to the information needed to respond to cyber threats.4 This includes:

  • User cybersecurity education for all levels. There should be regular security awareness training for teachers and staff. In the classrooms, children should have grade-appropriate education about what they should and should not do on their devices.
  • Endpoint protection. All devices and applications should require MFA or biometric authentication. Cybersecurity software that offers secure gateways, firewalls, and cloud security should be available.
  • Third-party services. Using MSSPs to manage the infrastructure and security issues like updates and patches covers the gaps of a small IT staff.

Addressing Security in the Age of AI

AI is creating new security risks for education, and school leaders are unsure how to address the increased use of generative AI by students and faculty. They aren’t alone; all industries are grappling with how to use AI in secure ways, but education is the only industry where the majority of users are under 18. How to handle privacy concerns or misinformation in generative AI is something that will need to become a top issue as schools continue to discuss their overall cybersecurity plans.

Cyberattacks are on the rise in the education sector, targeting the most vulnerable members of society. Schools must take greater steps to address the threats, as well as the immediate and future impacts of a data breach on students. 

How Connection Can Help

Connection is your partner for cybersecurity solutions and services. From hardware and software to consulting and customized solutions, we’re leading the way in education cybersecurity and solutions. 

Explore our Solutions and Services

Cybersecurity

K-12 Education Technology

Modern Infrastructure

Reach out to one of our Connection experts today:

Contact Us
1.800.998.0067

___

  1. Sophos, The State of Ransomware in Education 2023
  2. ThreatDown, 2024 State of Ransomware in Education: 92% Spike in K-12 Attacks
  3. K12 Dive, Nearly Half of K-12 Providers Hit by Ransomware Paid to Have Data Restored
  4. REMS, Cybersecurity Preparedness for K-12 Schools and Institutions of Higher Education

]]>
TechSperience Episode 130: AI in... https://community.connection.com/techsperience-episode-130-ai-in-manufacturing-productivity-at-the-point-of-use/ Nov 20, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/11/2846925-TechSperience-AIManufacturingProductivity-BLOG.jpg

Manufacturing is undergoing a quiet transformation as AI becomes embedded across the industry, often without companies fully realizing it. Despite abundant data, many manufacturers are missing chances to boost productivity and quality control. With mature tools like Vision AI gaining traction, the future of manufacturing will rely on intelligent automation, edge computing, and AI as a workforce enhancer rather than a replacement.

In this episode, we delve into the current and future role of AI in manufacturing, highlighting practical applications and the impact on efficiency. We’ll cover how manufacturers can better leverage data, embrace embedded AI, and foster collaboration to successfully implement AI solutions, emphasizing the importance of continuous learning in this evolving space.

Host:

James Hilliard

Guests:

Ryan Spurr, Senior Manufacturing Strategy and Business Development Director at Connection

Jeff Minushkin, VP Helix Core Engineering

Show Notes:

00:00 Introduction to AI in Manufacturing
02:52 Current State of AI Adoption in Manufacturing
06:05 The Future of AI: Trends and Innovations
09:03 Identifying Wins: Productivity and Quality Control
11:51 Technological Changes on the Horizon
14:47 Impact on Workforce: Operators, Engineers, and Managers
20:50 Engagement Process: From Strategy to Execution
26:57 The Path Forward with AI in Manufacturing

]]>
Understanding the Impact of SQL Server 2022... https://community.connection.com/understanding-the-impact-of-sql-server-2022-licensing-changes-and-how-connection-sam-services-can-help/ Nov 19, 2024 Seth Mitchell https://community.connection.com/author/seth-mitchell/ https://community.connection.com/wp-content/uploads/2024/11/2852223-Microsoft-SQL-Server-BLOG.jpg

In November 2022, Microsoft quietly implemented significant changes to their SQL Server licensing rules, particularly affecting virtual SQL servers. These updates, including a notable price increase and new requirements for Software Assurance (SA) for virtual environments, could have a profound impact on your organization’s IT budget and strategy. As IT executives, it’s crucial to stay informed about these changes and understand how they affect your licensing and compliance landscape. This blog aims to provide an overview of the licensing changes introduced with SQL Server 2022 and how Connection SAM services can support you in navigating these updates.

Key Licensing Attributes for SQL Server 2022

1. Price Increase: Effective January 2023, Microsoft implemented a 10% price increase for SQL Server licensing. The price hike reflects the added value and enhanced capabilities of SQL Server 2022, but it also means that organizations need to budget for higher licensing costs. This change underscores the importance of evaluating the total cost of ownership against the benefits provided by the new features and improvements in SQL Server 2022.

2. New Requirements for Virtual SQL Servers: One of the most impactful changes with SQL Server 2022 is the new requirement for Software Assurance (SA) for virtual SQL server environments. Previously, organizations could deploy SQL Server on virtual machines without necessarily needing SA. However, with the new rules, virtual deployments now require SA to be compliant.

3. Per Core Licensing Model: SQL Server 2022 continues to use the Per Core licensing model as an option. The per-core licensing model ensures that customers pay for the actual computing power they use, providing a fair and scalable approach to licensing. This consistency across different deployment environments simplifies the licensing process and helps organizations manage their licensing costs more effectively.
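As a rough illustration of how per-core counts add up in virtual environments, the sketch below applies two commonly published SQL Server rules (a four-core minimum per VM and licenses sold in two-core packs); verify both against your own agreement before budgeting.

```python
# Illustrative per-core license count for virtual SQL Server deployments.
# Assumes the commonly published rules of a four-core-license minimum per VM
# and licenses sold in two-core packs; verify against your agreement.
import math

def core_licenses_for_vm(vcpus, min_cores=4, pack_size=2):
    cores = max(vcpus, min_cores)                    # apply 4-core minimum
    return math.ceil(cores / pack_size) * pack_size  # round up to 2-packs

vms = [2, 4, 6, 9]  # vCPUs assigned to each virtual SQL Server
total = sum(core_licenses_for_vm(v) for v in vms)
print(total)  # 2->4, 4->4, 6->6, 9->10, so 24 core licenses
```

Note how small VMs still consume four licenses each and odd vCPU counts round up, which is why an inventory of actual vCPU assignments matters before renewal.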

Enhanced Benefits with Software Assurance

With Software Assurance (SA) or subscription licenses, SQL Server customers unlock a range of valuable benefits. These benefits include the Azure Hybrid Benefit, which allows customers to use their existing SQL Server licenses for Azure deployments at a reduced cost. Additionally, SA provides rights for failover servers, disaster recovery, high availability, and more—including now, of course, the option to license by virtual machine. These benefits enhance the overall value of SQL Server licensing and provide organizations with greater flexibility and cost-saving opportunities.

The updated SQL licensing terms reflect Microsoft’s commitment to providing flexible, cloud-centric, and sometimes cost-effective licensing options. The changes in virtual machine licensing and the price increase are designed to encourage the adoption of Software Assurance and/or subscription licenses. The continued use of the Per Core licensing model and the enhanced benefits of Software Assurance further support organizations in managing their licensing costs and maximizing the value of their SQL Server investments. As organizations transition to SQL Server 2022, it is essential to understand these changes and evaluate their impact on licensing strategies, budgets, and compliance efforts.

Navigating these significant changes can be daunting. Connection is here to help. Our team of highly seasoned experts specializes in helping organizations understand and comply with volume licensing metrics, ensuring that you are fully informed on the latest licensing rules. Our offerings include:

  • Assessment and Analysis: We provide a detailed assessment of your current SQL Server licensing and usage, identifying any gaps or areas where you may be at risk of non-compliance.
  • Strategy Development: Based on our analysis, we help you develop a licensing strategy that aligns with your organizational goals and budget, ensuring you can leverage the full benefits of SQL Server 2022.
  • Cost Optimization: We identify opportunities to optimize your licensing costs, such as utilizing the Azure Hybrid Benefit or adjusting your deployment models to maximize efficiency and reduce expenses.
  • Compliance Guidance: We assess your licensing strategy and consult on how to comply with Microsoft’s latest rules and regulations, providing peace of mind and avoiding potential legal and financial penalties.
  • Ongoing Support: Our extended Microsoft team offers ongoing support and guidance, keeping you updated on any future changes to SQL Server licensing and helping you adapt your strategy as needed.

Reach Out Today!

Our Connection SAM services are designed to provide you with the expertise and support you need to navigate these updates successfully. Contact us to learn more about how we can help you understand and comply with these new licensing requirements.

Let us help you turn these changes into opportunities for growth and efficiency. Together, we can ensure that your organization is well-prepared and fully compliant with the latest SQL Server licensing rules.

]]>
TechSperience Episode 129: Boosting... https://community.connection.com/techsperience-episode-129-boosting-workplace-productivity-with-embedded-ai/ Nov 14, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/11/2857091-TechSperience-Workplace-Productivity-AI-BLOG.jpg

Embedded AI is revolutionizing industries by making advanced technology accessible to all organizations, driving innovation through applications like contract automation and proposal generation.

In this episode, we explore how our clients are leveraging embedded AI solutions and address key challenges, including ROI concerns and ethical considerations. Learn why it’s vital to start your AI journey today—and how Connection can help you stay ahead.

If you want to learn more about how Connection is helping its customers improve productivity through embedded AI, visit our online resources:
www.connection.com/solutions-services/artificial-intelligence
www.connection.com/solutions-services/digital-workspace

Speakers:

Brian Gatke – Vice President, GTM Solutions Strategy and Portfolio at Connection
Travis Cook – Director, Industry Solutions at Connection

Show Notes:

00:00 Introduction to AI Adoption Challenges
02:50 Understanding Embedded AI and Its Applications
06:06 The Democratization of AI
08:58 Navigating Hesitancy in AI Implementation
11:47 Exploring Use Cases for AI in Various Industries
15:09 The Importance of Clear Business Challenges
18:02 Ethical Considerations in AI Deployment
20:52 The Urgency of Getting Started with AI

]]>
The Importance of Embedded AI PCs in... https://community.connection.com/the-importance-of-embedded-ai-pcs-in-manufacturing/ Nov 05, 2024 James Rust https://community.connection.com/author/james-rust/ https://community.connection.com/wp-content/uploads/2024/10/2834040-ISG-Manufacturing-Embedded-AI-PCs-BLOG.jpg

When I worked at a smaller manufacturing operation, we migrated our ERP system from a program that resembled DOS to a cloud-based, modern-looking ERP that promised to have all kinds of new capabilities we would enjoy. Upon launching and utilizing it on the factory floor computers, the shipping department found that processing a single shipment in the new system took 20 extra minutes of computer work. The reason was simple—our infrastructure was not where it needed to be to handle this new software.

Hardware Requirements Can’t Be Neglected

The unfortunate truth is that this is not a rare occurrence. Software is only as good as the hardware it runs on. There’s no question that AI is the hottest topic in business these days, but like any software, it needs to run on the proper hardware. Since the most familiar forms of generative AI are public LLMs that can run on practically any computer, it comes as a surprise to many that business AI use cases require powerful devices designed specifically for use with AI-powered software. That’s where embedded AI PCs come in.

Despite the name, embedded AI PCs do not usually come with AI applications installed. They are simply PCs (usually in a micro form factor) that are tailor-made to run AI-powered software and can quickly respond to input, making them crucial to certain operations. They can also be built to withstand the rugged conditions of any manufacturing environment.

AI has proven itself able to truly excel when given tasks involving pattern recognition, so most applications that yield good results are created with this in mind. Thus, it makes perfect sense that AI-powered camera vision can effectively perform quality control—it’s been trained that products must be a certain way, and anything that doesn’t conform to that pattern needs to be flagged. Even better, it can see these issues in real time during the production process, so manufacturers can easily determine when and where the defect was caused. If the same issues are popping up repeatedly, AI can also analyze the data and make process improvement recommendations.

However, running this software on a traditional PC could mean several minutes pass before a result is returned, and in a high-throughput facility, this can be especially troublesome. For example, a quality control inspection performed by a human on a circuit board can take several minutes or even hours depending on the complexity of the product. AI programs running on hardware with the right processing power can use high-resolution cameras to check a board in under a minute, allowing for much faster processing and often more accurate test results. Such rapid testing means it can also be done multiple times throughout the facility instead of one formal inspection at the end. The days of wasting precious time and resources on defective products that should have been pulled out five operations prior to final inspection can finally be over.
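To make the pattern-recognition idea concrete, here is a minimal sketch of flagging a board that deviates from a known-good reference image. Real inspection systems use trained vision models; the pixel-difference thresholds here are purely hypothetical stand-ins for illustration.

```python
import numpy as np

# Hypothetical thresholds -- real inspection systems use trained models,
# not a raw pixel difference.
PIXEL_TOLERANCE = 30      # per-pixel deviation allowed (0-255 grayscale)
DEFECT_THRESHOLD = 0.05   # fraction of pixels allowed to deviate

def flag_defect(board_image: np.ndarray, reference: np.ndarray) -> bool:
    """Return True if the board deviates too far from the known-good pattern."""
    diff = np.abs(board_image.astype(int) - reference.astype(int))
    deviating = np.mean(diff > PIXEL_TOLERANCE)  # share of out-of-tolerance pixels
    return bool(deviating > DEFECT_THRESHOLD)

# Example: an identical board passes; one with a large blemish is flagged.
reference = np.zeros((480, 640), dtype=np.uint8)
good_board = reference.copy()
bad_board = reference.copy()
bad_board[100:200, 100:300] = 255  # a large bright blemish

print(flag_defect(good_board, reference))  # False
print(flag_defect(bad_board, reference))   # True
```

Because the check is just array math, it can run on every board at every station rather than once at final inspection, which is exactly where the dedicated processing power of an embedded AI PC pays off.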

The Use Cases Are Limitless

Once you have hardware and software that can react quickly, the possibilities are endless—which can make it tough to know where to start. My advice is to tackle the persistent challenges in your business that are shared by most operations: keeping your machines up and running and keeping your automation reliable.

Factories have been aiming for true predictive maintenance for decades, but it’s been a very difficult challenge for most operations. However, once you’re equipped with devices that can quickly respond to machine conditions, the goal of zero breakdowns is within sight. Embedded AI PCs have the processing power needed to look at all your real-time factory data and immediately adjust machine operations to prevent errors and breakdowns. They can also automatically notify maintenance personnel of changing machine conditions so that repairs can be scheduled when the machine is not in use.

Considering that 76% of manufacturers are using automation in their factories, the value it can bring is undeniable. However, even simple automated operations can suffer errors and crashes, because the machine is blindly following a program and will carry out its instructions regardless of the circumstances, even when a human would know to stop. Embedded AI PCs running your machines, on the other hand, can adjust the program based on circumstances just the way a person would.

I once worked in a facility that involved a lot of liquid handling. Ultimately, all you needed to do was draw liquid from a source and dispense it in another location from a syringe-like container. This is extremely easy for anyone to do, but automated machines had issues due to variation in container size, container location, and several other factors. I spent days upon days fine-tuning programs to handle all of the exceptions, but an embedded AI PC running the right software could have simply run the basic program and recognized “this is a different container, stop short of the container bottom and dispense liquid” or “the container is off center, adjust the robot arm half an inch right,” and we could have run our operation with practically any variation. This same type of functionality can be applied to almost any operation that involves automation. All of this is only possible due to the near-instantaneous reaction time offered by embedded AI PCs.
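The kind of on-the-fly adjustment described above can be sketched in a few lines. The container measurements, base dispense depth, and safety margin below are all hypothetical values chosen for illustration, not parameters of any real machine.

```python
from dataclasses import dataclass

@dataclass
class Container:
    depth_mm: float      # depth measured by the vision system
    offset_x_mm: float   # horizontal offset from the expected center

# Hypothetical base program values for illustration.
BASE_DISPENSE_DEPTH_MM = 50.0
SAFETY_MARGIN_MM = 5.0

def adjust_dispense(container: Container) -> dict:
    """Adapt the base dispense program to the container actually seen."""
    # Stop short of the bottom of a shallower-than-expected container.
    depth = min(BASE_DISPENSE_DEPTH_MM, container.depth_mm - SAFETY_MARGIN_MM)
    # Shift the arm to re-center over an off-center container.
    return {"dispense_depth_mm": depth, "arm_shift_x_mm": container.offset_x_mm}

# A shallower, off-center container gets a shorter stroke and a sideways shift.
print(adjust_dispense(Container(depth_mm=40.0, offset_x_mm=12.7)))
```

The base program stays simple; the per-cycle measurements do the exception handling that would otherwise require days of manual fine-tuning.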

Don’t Go It Alone

If you’ve ever built your own computer, you know it can be frustrating and time-consuming to make sure you have the right parts to run the software you need at a reasonable price, which is partly why prebuilt PCs have become so popular.

There’s no reason to add to the difficulty of implementing AI in your factory—that’s what Connection is here for. If you’re considering harnessing the power of AI on the manufacturing floor, engage our Manufacturing Practice today. We will make sure you have the right PCs for your specific use case so that everything will run as intended, and you can start reaping the benefits of this exciting new technology.

]]>
Boosting Cybersecurity for Schools and... https://community.connection.com/boosting-cybersecurity-for-schools-and-libraries-a-new-fcc-program/ Oct 22, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/10/2823899-Boosting-Cybersecurity-for-Schools-Libraries-BLOG.jpg

In today’s digital world, cybersecurity is more important than ever—especially in schools and libraries. To help these institutions stay protected, the Federal Communications Commission (FCC) has launched an exciting new initiative. On June 6, 2024, the FCC approved a three-year Cybersecurity Pilot Program, which will provide up to $200 million in support to schools and libraries across the U.S. This funding will help cover the costs of cybersecurity services and equipment, ensuring that these vital community hubs are better protected from evolving online threats.

What Is the Cybersecurity Pilot Program?

This program strives to ensure that schools and libraries have the tools they need to safeguard their networks and data. Over the next three years, the FCC will assess how effective it is to use federal funding to boost cybersecurity in these settings, with the potential to make this support permanent in the future. The ultimate goal is to create safer digital spaces for students, educators, and library users.

Who Can Apply?

If you’re part of a K–12 school or a public library, you’re eligible to apply! The FCC is looking for institutions that are ready to invest in cybersecurity and work together to strengthen their defenses. Whether you're in a large school district or a small community library, there’s a place for you in this program.

Program Funding and Budget Caps

Here’s how the budget breaks down:

  • Schools and School Districts (S&SDs):
    • Up to $45,000 for those with fewer than 1,100 students.
    • $40.80 per student for districts with between 1,101 and 110,294 students.
    • A maximum of $4.5 million for districts with more than 110,294 students.
  • Libraries and Library Systems:
    • Up to $45,000 for individual libraries not part of a larger system.
    • $45,000 per library for systems with fewer than 12 locations.
    • A maximum of $525,000 for larger systems with 12 or more branches.

Participants can apply for reimbursements as they spend their allocated funds, giving them flexibility to meet immediate and future cybersecurity needs.
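As a rough illustration of how the school-district tiers above combine, here is a quick budget-cap sketch. The boundaries follow the figures quoted above (note the quoted ranges leave exactly 1,100 students between tiers; the sketch treats that case as per-student funding). This is an estimate for planning conversations, not official FCC guidance.

```python
def district_funding_cap(students: int) -> float:
    """Estimated funding cap for a school district, using the tiers quoted above."""
    if students < 1_100:
        return 45_000.0            # small-district floor
    if students <= 110_294:
        return 40.80 * students    # per-student tier
    return 4_500_000.0             # large-district ceiling

print(district_funding_cap(800))      # 45000.0
print(district_funding_cap(5_000))    # roughly 204000
print(district_funding_cap(200_000))  # 4500000.0
```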

What Can You Spend It On?

The program covers a range of services and equipment, including:

  • Advanced firewalls
  • Endpoint protection (like antivirus and anti-ransomware software)
  • Identity protection and authentication
  • Monitoring and response systems

Microsoft Solutions for the Program

Many Microsoft solutions align closely with the eligible services and equipment defined by the program:

  • Advanced/Next-generation Firewalls
    • Azure Firewall: A cloud-native, scalable firewall service that provides features like threat intelligence-based filtering, network segmentation, and DDoS protection.
    • Microsoft Defender for Cloud: A comprehensive security management tool offering threat detection and prevention, integrated with cloud-delivered intelligence.
  • Endpoint Protection
    • Microsoft Defender for Endpoint: Provides enterprise-grade security for end-user devices, including advanced threat detection, anti-malware, anti-ransomware, and extended detection and response (XDR).
    • Microsoft Intune: Provides mobile device management (MDM) and application management (MAM) for protecting devices in K–12 and library environments.
    • Microsoft Purview Data Loss Prevention (DLP): Safeguards data on devices against potential leaks or misuse.
  • Identity Protection and Authentication
    • Azure Active Directory (Azure AD): Enables identity protection and access management with features like MFA, Single Sign-On (SSO), and identity governance.
    • Microsoft Entra: Offers identity and access management tools including Privileged Identity Management (PIM) and Zero Trust architectures.
    • Microsoft Defender for Identity: Monitors and protects against identity-related threats in on-premises environments.
  • Monitoring, Detection, and Response
    • Microsoft Sentinel: A cloud-native SIEM and SOAR solution offering real-time threat monitoring, detection, and automated response.
    • Microsoft Defender for Cloud: Provides continuous security posture management, vulnerability scanning, and compliance assessment.
    • Azure Security Center: Offers advanced monitoring, threat intelligence, and incident response across hybrid environments.

How to Apply

The application process happens in two parts. In Part 1, you’ll submit an overview of your cybersecurity project. If selected, Part 2 will ask for more detailed information about your plans and previous cybersecurity efforts. The FCC expects applications to open this fall, so now is the time to start preparing!

Selection Process

The FCC will work with the Universal Service Administrative Company (USAC) to choose a variety of projects, ensuring that both large and small, urban and rural schools and libraries are included, with a particular focus on low-income and Tribal communities.

Getting Ready

To make sure you’re prepared:

  • Stay updated by reviewing the program’s details and FCC announcements.
  • Double-check your institution’s registrations.
  • Assess your cybersecurity needs and start planning.

This pilot program represents a vital investment in the security of schools and libraries, helping to create a safe and secure digital environment for everyone who relies on these important community institutions. For more information and updates, visit the FCC’s Cybersecurity Pilot Program website.

How Can Connection Help?

Connection's cybersecurity services provide comprehensive assessment and testing solutions, including penetration testing, security audits, and vulnerability scans, to identify and address risks within IT environments. Our team is dedicated to ensuring organizations remain compliant and secure against potential cyber threats, with support from key partners like Microsoft. By leveraging Microsoft's advanced services, we enhance the security and resilience of your organization's infrastructure. For more information, visit us online.

]]>
TechSperience Episode 128: AI-Powered... https://community.connection.com/techsperience-episode-128-ai-powered-retail-balancing-innovation-and-security/ Oct 21, 2024 Connection https://community.connection.com/author/connection/ https://community.connection.com/wp-content/uploads/2024/10/2827021-TechSperience-AIPoweredRetail-BLOG.jpg

AI is transforming the retail landscape, offering unprecedented opportunities to enhance efficiency, personalize the customer experience, and optimize operations. But with these opportunities come new challenges.

In this episode, we explore how AI can empower employees, drive hyper-personalization, and revolutionize supply chain management. We also delve into the critical security risks associated with AI adoption and discuss how retailers can balance innovation with robust security measures.

Speakers:

Brian Gallagher, Connection Retail Strategy Director
Rai Basharat, Connection Senior Director Data Orchestration, Helix Center for Applied AI and Robotics

Show Notes:

00:00 Introduction to AI in Retail
03:08 Top Benefits of AI for Retail
05:59 Enhancing Customer Experience with AI
08:52 AI's Role in Employee Productivity
11:59 AI in Supply Chain Optimization
14:48 Security Implications of AI in Retail
18:00 Balancing Innovation and Security
21:06 The Future of AI in Retail
23:52 The Human-AI Relationship in Retail
26:48 Conclusion and Next Steps

]]>
Boosting Productivity with AI https://community.connection.com/boosting-productivity-with-ai/ Oct 17, 2024 Brian Gatke https://community.connection.com/author/brian-gatke/ https://community.connection.com/wp-content/uploads/2024/10/2821151-Blog-Boosting-Productivity-AI-BLOG.jpg

Artificial intelligence (AI) is transforming workplaces across industries—boosting employee productivity with embedded tools. While AI offers countless use cases, the key to success lies in identifying the right applications and overcoming challenges like data security and integration complexity.

AI is still making waves at work, and with good reason. Across all industries, AI tools are expected to ratchet up employee productivity by an estimated 40%. Embedded tools like Microsoft Copilot or Adobe Firefly are making the biggest and earliest digital splash, injecting new efficiencies without feeling like a heavy lift.

In fact, 77% of us already use AI at work, but because embedded tools can feel so non-invasive, only a third of us realize it. At the same time, the use cases are so widespread that implementing AI with ROI in mind can feel a bit like running through a generative maze.

To help us plot a course, I spoke with two experts in the field: Dan Ortiz, Director of the Digital Workspace Center of Excellence at Connection, and Chi Chung, Connection’s Director of Solution Architects.

A Flood of Use Cases

Creating new policies and directives for something as far-reaching as AI can be resource intensive. AI has literally hundreds, if not thousands, of use cases in business, including:

  • AI-powered customer service chatbots and virtual assistants that resolve issues 24/7
  • Data analysis algorithms that quickly uncover trends, patterns, and insights
  • Automation of routine tasks like data entry, scheduling, and email management, freeing up valuable employee time

To double-click just one of those, early adopters are already using AI chatbots to give level-zero responses that solve customer problems at the first touch. These chatbots can be incredibly robust thanks to the wealth of information many orgs have developed over time.

“The takeaway here,” says Ortiz, “is that there are so many use cases, it’s incredibly important to be intentional about how we move forward.”

1. Lay the Groundwork

Chung explains that any organization that wants to increase productivity with AI should start by asking one important question: What use cases will benefit us most?

“That’s really the key,” says Chung.

To get there, Ortiz recommends that organizations workshop their AI goals and benefits, creating an AI task force or committee that represents all the different functions of the company. “That task force should then identify potential benefits and outline the success criteria.”

From there, the task force should define a clear action plan and determine the criteria they’ll need to meet before they implement.

2. Identify the Challenges

The next big step in the shortest path to AI productivity gains is to spotlight the challenges that are in our way. Among the biggest are data security, data silos, and integration complexity.

  • Data security: It’s mission-critical to keep your organization’s data security in compliance with privacy regulations and guidelines, even as AI steps in to help.
  • Data silos: Data is stored in different systems in every organization. This can make it very hard for AI to see the bigger picture, hampering its insights and analysis.
  • Integration complexity: Integrating data from multiple sources is complex and often requires custom connectors or additional software, increasing complexity and the cost of integration.
  • Data quality and consistency: If you don't have good data hygiene, you can’t keep your data quality consistent across all your different platforms. This creates a garbage-in, garbage-out scenario, as Chung points out.

“Start by identifying the data sets that will help you reach your target level of success,” Ortiz says. “Then, thoughtfully align your AI efforts with the benefits you'd like to get.”

3. Choose Between Embedded and Custom AI Solutions

The next important choice is deciding on the right AI solution. Embedded AI tools like Copilot and Firefly are ready-made to tackle a wide variety of real-world business situations. Custom-built solutions, by contrast, can cost more and take time to develop, but they offer tailored features.

Chung and Ortiz encourage stakeholders to complete a technical readiness assessment before they settle on a tool. Fully evaluate your team’s data hygiene, identity, governance, access posture, technology gaps, and how you’ll prepare your user community to use AI tools. Chung adds, “It’s often best to start with embedded AI, then baby-step your way into a custom solution, using APIs and adding tailored features.”

4. Where to Start

Any new venture as complex as boosting productivity with AI can be daunting. That’s why it’s best to start by talking to an AI expert. Professionals like Ortiz or Chung can help you pinpoint where you are in your AI journey, what you can achieve, and the fastest way to get there.

“Connection has the best SMEs in the industry,” says Ortiz, “and we’d love to talk to you. Let us know what’s on your mind and let us work for you.” With the right team on your side, realizing real productivity gains with AI isn’t as challenging as you might think.

To hear more from Ortiz and Chung about implementing AI solutions, you can watch the full webinar.

Explore our AI solutions and Digital Workspace services today by contacting an expert at 1.800.998.0067.

]]>