Fog Computing May Help to Save Energy in Cloud Computing.

IEEE Journal on Selected Areas in Communications (2016)

Cited 467 | Views 61
Abstract
Tiny computers located in end-user premises are becoming popular as local servers for Internet of Things (IoT) and Fog computing services. These highly distributed servers, which can host and distribute content and applications in a peer-to-peer (P2P) fashion, are known as nano data centers (nDCs). Despite the growing popularity of nano servers, their energy consumption has not been well investigated. To study the energy consumption of nDCs, we propose and use flow-based and time-based energy consumption models for shared and unshared network equipment, respectively. To apply and validate these models, a set of measurements and experiments is performed to compare the energy consumption of a service provided by nDCs with that of centralized data centers (DCs). A number of findings emerge from our study, including the factors in the system design that allow nDCs to consume less energy than their centralized counterparts. These include the type of access network attached to nano servers and the nano server's time utilization (the ratio of idle time to active time). Additionally, the type of applications running on nDCs and factors such as the number of downloads, the number of updates, and the amount of preloaded data copies influence the energy cost. Our results reveal that the number of hops between a user and content has little impact on total energy consumption compared with the above-mentioned factors. We show that nano servers in Fog computing can complement centralized DCs to serve certain applications, mostly IoT applications for which the source of data is in end-user premises, and can lead to energy savings if the applications (or parts of them) can be off-loaded from centralized DCs and run on nDCs.
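To make the abstract's modelling distinction concrete, the sketch below illustrates, in Python, one plausible reading of the two model families it names: a flow-based model that charges shared network equipment per bit carried, and a time-based model that charges unshared equipment (such as a nano server) for its active and idle time. This is not the paper's actual formulation; all function names and parameter values here are hypothetical.

# Illustrative sketch only, not the paper's exact models.

def flow_based_energy(bytes_transferred, energy_per_bit_joules):
    # Shared network equipment: energy assumed proportional to traffic volume.
    return bytes_transferred * 8 * energy_per_bit_joules

def time_based_energy(active_seconds, idle_seconds,
                      active_power_watts, idle_power_watts):
    # Unshared equipment (e.g. a nano server): energy assumed proportional to
    # time spent in each state, at a fixed power draw per state.
    return (active_seconds * active_power_watts
            + idle_seconds * idle_power_watts)

if __name__ == "__main__":
    # Hypothetical numbers: 1 GB carried through shared gear at 100 nJ/bit,
    # plus a nano server active for 60 s and idle for 3540 s of an hour.
    shared = flow_based_energy(1e9, 100e-9)
    unshared = time_based_energy(60, 3540, 10.0, 5.0)
    print(f"shared network energy:  {shared:.1f} J")
    print(f"unshared server energy: {unshared:.1f} J")

Under such a split, the abstract's observation that the nano server's idle-to-active ratio and the access network type dominate the energy cost corresponds to the time-based term and the per-bit cost of the shared path, respectively.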
Keywords
Servers, Energy consumption, Power demand, Data models, Peer-to-peer computing, Distributed databases, Energy measurement