Data Structure and Algorithm Analysis
06: Priority Queues (Heaps)
Course page: …/course/cs202/2014
Hongfei Yan, School of EECS, Peking University, 4/23/2014

Course contents: 01 Programming: A General Overview (20-65); 02 Algorithm Analysis (70-89); 03 Lists, Stacks, and Queues (96-135); 04 Trees (140-200); 05 Hashing (212-255); 06 Priority Queues (Heaps) (264-302); 07 Sorting (310-360); 08 The Disjoint Sets Class (370-393); 09 Graph Algorithms (398-456); 10 Algorithm Design Techniques (468-537).

Motivation
Jobs sent to a printer are generally placed on a queue, but one job might be particularly important. Conversely, if there are several 1-page jobs and one 100-page job, it might be reasonable to make the long job go last, even if it is not the last job submitted. In a multiuser environment, the operating system scheduler must decide which of several processes to run. Generally, it is important that short jobs finish as fast as possible, but some jobs that are not short are still very important and should also have precedence. These situations require a special kind of queue, known as a priority queue.

This chapter covers: 6.1 Model; 6.2 Simple Implementations; 6.3 Binary Heap; 6.4 Applications of Priority Queues; 6.5 d-Heaps; 6.6 Leftist Heaps; 6.7 Skew Heaps; 6.8 Binomial Queues; 6.9 Priority Queues in the Standard Library.

6.1 Model
The basic model of a priority queue allows at least the following two operations: insert, which does the obvious thing, and deleteMin, which finds, returns, and removes the minimum element in the priority queue.
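As a minimal sketch of this two-operation model in C++ (the abstract base class and the isEmpty helper are illustrative assumptions, not part of the slide's model):

// Minimal sketch of the priority-queue model: insert and deleteMin.
template <typename Comparable>
class PriorityQueue
{
  public:
    virtual ~PriorityQueue() = default;
    virtual bool isEmpty() const = 0;
    virtual void insert(const Comparable & x) = 0;   // add x to the queue
    virtual Comparable deleteMin() = 0;              // find, return, and remove the minimum
};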
6.2 Simple Implementations
There are several obvious ways to implement a priority queue. We could use a simple linked list, performing insertions at the front in O(1) and traversing the list, which requires O(N) time, to delete the minimum. Alternatively, the list could be kept sorted at all times; this makes insertions expensive (O(N)) and deleteMins cheap (O(1)). The former is probably the better idea of the two, based on the fact that there are never more deleteMins than insertions.
Another way of implementing priority queues would be to use a binary search tree. This gives an O(log N) average running time for both operations, but repeatedly removing a node that is in the left subtree would seem to hurt the balance of the tree by making the right subtree heavy. Using a search tree could also be overkill because it supports a host of operations that are not required.
6.3 Binary Heap
The implementation we will use is known as a binary heap, often referred to simply as a heap. Heaps have two properties, namely, a structure property and a heap-order property. As with AVL trees, an operation on a heap can destroy one of the properties, so a heap operation must not terminate until all heap properties are in order. This section covers: 6.3.1 Structure Property; 6.3.2 Heap-Order Property; 6.3.3 Basic Heap Operations; 6.3.4 Other Heap Operations.

6.3.1 Structure Property
A heap is a binary tree that is completely filled, with the possible exception of the bottom level, which is filled from left to right. Such a tree is known as a complete binary tree. A complete binary tree of height h has between 2^h and 2^(h+1) - 1 nodes. This implies that the height of a complete binary tree is ⌊log N⌋, which is clearly O(log N).

Array implementation of a complete binary tree
An important observation is that because a complete binary tree is so regular, it can be represented in an array and no links are necessary. For any element in array position i, the left child is in position 2i, the right child is in the cell after the left child (2i + 1), and the parent is in position ⌊i/2⌋. Thus, not only are links not required, but the operations needed to traverse the tree are extremely simple and likely to be very fast on most computers. The only problem with this implementation is that an estimate of the maximum heap size is required in advance, but typically this is not a problem (and we can resize if needed).
A heap data structure will, then, consist of an array (of Comparable objects) and an integer representing the current heap size. We shall draw the heaps as trees, with the implication that an actual implementation will use simple arrays. The slides then present the class interface for such a priority queue.
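A minimal sketch of such an array-backed class, assuming 1-based indexing with position 0 unused (the names and the default capacity are illustrative, not the course's code):

#include <vector>

template <typename Comparable>
class BinaryHeap
{
  public:
    explicit BinaryHeap(int capacity = 100)
        : array(capacity + 1), currentSize{0} { }

    bool isEmpty() const { return currentSize == 0; }
    const Comparable & findMin() const { return array[1]; }  // the root holds the minimum

  private:
    std::vector<Comparable> array;   // elements live in positions 1..currentSize
    int currentSize;                 // number of elements currently in the heap

    // Complete-binary-tree index arithmetic: no parent/child links are needed.
    static int parent(int i)     { return i / 2; }
    static int leftChild(int i)  { return 2 * i; }
    static int rightChild(int i) { return 2 * i + 1; }
};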
6.3.2 Heap-Order Property
The property that allows operations to be performed quickly is the heap-order property. Since we want to be able to find the minimum quickly, it makes sense that the smallest element should be at the root. If we consider that any subtree should also be a heap, then any node should be smaller than all of its descendants.

6.3.3 Basic Heap Operations
Insertion
To insert an element X into the heap, we create a hole in the next available location. If placing X in the hole would violate heap order, we slide the hole's parent down into the hole, moving the hole up one level, and repeat until X can be placed. This general strategy is known as a percolate up: the new element is percolated up the heap until the correct location is found. A sketch of the insertion procedure follows.
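A hedged sketch of percolate-up insertion (not the slide's figure), assuming the heap is stored in a std::vector<int> with 1-based indexing and heap[0] left unused:

#include <vector>
#include <iostream>

// Insert x by creating a hole at the end and percolating it up:
// as long as the parent is larger than x, slide the parent down.
void heapInsert(std::vector<int> & heap, int x)
{
    heap.push_back(x);                              // create a hole in the next available location
    int hole = static_cast<int>(heap.size()) - 1;
    for (; hole > 1 && x < heap[hole / 2]; hole /= 2)
        heap[hole] = heap[hole / 2];                // slide the larger parent down one level
    heap[hole] = x;                                 // x has found its correct location
}

int main()
{
    std::vector<int> heap{0};                       // position 0 is the unused placeholder
    for (int x : {13, 21, 16, 24, 31, 19, 68, 65, 26, 32, 14})
        heapInsert(heap, x);
    std::cout << "minimum is " << heap[1] << '\n';  // prints "minimum is 13"
    return 0;
}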
deleteMin
deleteMins are handled in a similar manner to insertions. Finding the minimum is easy; the hard part is removing it. When the minimum is removed, a hole is created at the root. Since the heap now becomes one smaller, it follows that the last element X in the heap must move somewhere in the heap. If X can be placed in the hole, then we are done. This is unlikely, so we slide the smaller of the hole's children into the hole, thus pushing the hole down one level. We repeat this step until X can be placed in the hole. Thus, our action is to place X in its correct spot along a path from the root containing minimum children; this is known as a percolate down. A sketch of deleteMin follows.
A frequent implementation error in heaps occurs when there is an even number of elements in the heap, and the one node that has only one child is encountered. You must make sure not to assume that there are always two children, so this usually involves an extra test.
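A hedged sketch of deleteMin with its percolate-down helper (same 1-indexed std::vector<int> layout as before; not the slide's figure). Note the explicit child != size test for the one node that may have only a left child:

#include <vector>
#include <stdexcept>

// Push the hole at position 'hole' down, always following the smaller child,
// until the saved value tmp can be placed.
void percolateDown(std::vector<int> & heap, int hole)
{
    const int size = static_cast<int>(heap.size()) - 1;   // elements occupy 1..size
    int tmp = heap[hole];
    for (int child; 2 * hole <= size; hole = child)
    {
        child = 2 * hole;
        if (child != size && heap[child + 1] < heap[child])
            ++child;                                       // pick the smaller of the two children
        if (heap[child] < tmp)
            heap[hole] = heap[child];                      // push the hole down one level
        else
            break;
    }
    heap[hole] = tmp;
}

int deleteMin(std::vector<int> & heap)
{
    if (heap.size() <= 1)
        throw std::underflow_error("deleteMin on an empty heap");
    int minItem = heap[1];
    heap[1] = heap.back();       // the last element X must move somewhere in the heap
    heap.pop_back();
    if (heap.size() > 1)
        percolateDown(heap, 1);  // let X percolate down from the root hole
    return minItem;
}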
6.3.4 Other Heap Operations
A heap has very little ordering information, so there is no way to find any particular element without a linear scan through the entire heap. If it is important to know where elements are, some other data structure, such as a hash table, must be used in addition to the heap. If we assume that the position of every element is known by some other method, then several other operations become cheap. The first three operations below all run in logarithmic worst-case time.
decreaseKey
The decreaseKey(p, Δ) operation lowers the value of the item at position p by a positive amount Δ. Since this might violate the heap order, it must be fixed by a percolate up. This operation could be useful to system administrators: they can make their programs run with highest priority.
increaseKey
The increaseKey(p, Δ) operation increases the value of the item at position p by a positive amount Δ. This is done with a percolate down. Many schedulers automatically drop the priority of a process that is consuming excessive CPU time.
remove
The remove(p) operation removes the node at position p from the heap. This is done by first performing decreaseKey(p, ∞) and then performing deleteMin(). When a process is terminated by a user (instead of finishing normally), it must be removed from the priority queue.
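Hedged sketches of these position-based operations on the same 1-indexed std::vector<int> heap (removeAt and the delta parameters are illustrative names; the percolate routines are repeated from the earlier sketches so the example stands alone):

#include <vector>
#include <limits>

void percolateUp(std::vector<int> & heap, int hole)
{
    int tmp = heap[hole];
    for (; hole > 1 && tmp < heap[hole / 2]; hole /= 2)
        heap[hole] = heap[hole / 2];
    heap[hole] = tmp;
}

void percolateDown(std::vector<int> & heap, int hole)
{
    const int size = static_cast<int>(heap.size()) - 1;
    int tmp = heap[hole];
    for (int child; 2 * hole <= size; hole = child)
    {
        child = 2 * hole;
        if (child != size && heap[child + 1] < heap[child])
            ++child;
        if (heap[child] < tmp) heap[hole] = heap[child];
        else break;
    }
    heap[hole] = tmp;
}

// decreaseKey(p, delta): lower the key at position p; only an upward fix is needed.
void decreaseKey(std::vector<int> & heap, int p, int delta)
{
    heap[p] -= delta;
    percolateUp(heap, p);
}

// increaseKey(p, delta): raise the key at position p; only a downward fix is needed.
void increaseKey(std::vector<int> & heap, int p, int delta)
{
    heap[p] += delta;
    percolateDown(heap, p);
}

// remove(p): decreaseKey(p, "infinity") followed by deleteMin.
void removeAt(std::vector<int> & heap, int p)
{
    heap[p] = std::numeric_limits<int>::min();   // plays the role of minus infinity
    percolateUp(heap, p);                        // the removed key rises to the root
    heap[1] = heap.back();                       // deleteMin: move the last element into the root
    heap.pop_back();
    if (heap.size() > 1)
        percolateDown(heap, 1);
}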
buildHeap
A binary heap is sometimes constructed from an initial collection of items. This constructor takes as input N items and places them into a heap. Obviously, this can be done with N successive inserts. Since each insert will take O(1) average and O(log N) worst-case time, the total running time of this algorithm would be O(N) average but O(N log N) worst-case. So we already know that the construction can be performed in linear average time; what is interesting is that the worst case can be made linear as well.
The general algorithm is to place the N items into the tree in any order, maintaining the structure property. Then, if percolateDown(i) percolates down from node i, the buildHeap routine in Figure 6.14 of the text can be used by the constructor to create a heap-ordered tree. A sketch of the idea follows.
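A sketch of this buildHeap strategy on a 1-indexed std::vector<int> (the percolateDown routine from the deleteMin sketch is repeated so the example stands alone; this is not Figure 6.14 itself):

#include <vector>

void percolateDown(std::vector<int> & heap, int hole)
{
    const int size = static_cast<int>(heap.size()) - 1;
    int tmp = heap[hole];
    for (int child; 2 * hole <= size; hole = child)
    {
        child = 2 * hole;
        if (child != size && heap[child + 1] < heap[child])
            ++child;
        if (heap[child] < tmp) heap[hole] = heap[child];
        else break;
    }
    heap[hole] = tmp;
}

// Place the items in the array in arbitrary order (structure property only),
// then percolate down every non-leaf node, from the last one back to the root.
void buildHeap(std::vector<int> & heap)
{
    const int size = static_cast<int>(heap.size()) - 1;
    for (int i = size / 2; i > 0; --i)   // nodes size/2 + 1 .. size are leaves
        percolateDown(heap, i);
}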
To bound the running time of buildHeap, we must bound the number of dashed lines (in the textbook's figures, the dashed lines trace how far each element could percolate down). This can be done by computing the sum of the heights of all the nodes in the heap, which is the maximum number of dashed lines. What we would like to show is that this sum is O(N).
Theorem. For the perfect binary tree of height h containing 2^(h+1) - 1 nodes, the sum of the heights of the nodes is 2^(h+1) - 1 - (h + 1).
Proof. It is easy to see that this tree consists of 1 node at height h, 2 nodes at height h - 1, 2^2 nodes at height h - 2, and in general 2^i nodes at height h - i. The sum of the heights of all the nodes is therefore S = Σ_{i=0}^{h} 2^i (h - i), which the derivation below shows equals 2^(h+1) - 1 - (h + 1).
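Written out and telescoped (a reconstruction of the standard argument; the slide's own derivation is not reproduced in this transcript):

\begin{align*}
S          &= \sum_{i=0}^{h} 2^{i}(h-i) = h + 2(h-1) + 4(h-2) + \cdots + 2^{h-1}\cdot 1,\\
2S         &= 2h + 4(h-1) + 8(h-2) + \cdots + 2^{h}\cdot 1,\\
S = 2S - S &= -h + \bigl(2 + 4 + \cdots + 2^{h-1} + 2^{h}\bigr)
            = \bigl(2^{h+1} - 2\bigr) - h
            = 2^{h+1} - 1 - (h+1).
\end{align*}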
buildHeap is linear
A complete tree is not a perfect binary tree, but the result we have obtained is an upper bound on the sum of the heights of the nodes in a complete tree. Since a complete tree has between 2^h and 2^(h+1) nodes, this theorem implies that this sum is O(N), where N is the number of nodes. Although this result is sufficient to show that buildHeap is linear, the bound on the sum of the heights is not as strong as possible. For a complete tree with N = 2^h nodes, the bound we have obtained is roughly 2N. The sum of the heights can be shown by induction to be N - b(N), where b(N) is the number of 1s in the binary representation of N.

6.4 Applications of Priority Queues
6.4.1 The Selection Problem
The input is a list of N elements, which can be totally ordered, and an integer k. The selection problem is to find the kth largest element.
Algorithm 1A is to read the elements into an array and sort them, returning the appropriate element. Assuming a simple sorting algorithm, the running time is O(N^2). Algorithm 1B is to read k elements into an array and sort them, and then process the remaining elements one by one. As an element arrives, it is compared with the kth element in the array. If it is larger, then the kth element is removed, and the new element is placed in the correct place among the remaining k - 1 elements. When the algorithm ends, the element in the kth position is the answer. The running time is O(N·k). If k = ⌈N/2⌉, then both algorithms are O(N^2). This also happens to be the most interesting case, since this value of k is known as the median.

Median
The median is the middle number in a sorted list of numbers: place the numbers you are given in value order and find the middle one. Example: find the median of 13, 23, 11, 16, 15, 10, 26. Put them in order: 10, 11, 13, 15, 16, 23, 26. The middle number is 15, so the median is 15. (If there are two middle numbers, you average them.)

Algorithm 6A
For simplicity, assume that we are interested in finding the kth smallest element. Read the N elements into an array, apply the buildHeap algorithm to this array, and then perform k deleteMin operations. This gives a total running time of O(N + k log N). If k = O(N / log N), then the running time is dominated by the buildHeap operation and is O(N). For larger values of k, the running time is O(k log N). If k = ⌈N/2⌉, then the running time is Θ(N log N). Notice that if we run this program for k = N and record the values as they leave the heap, we will have essentially sorted the input file in O(N log N) time. A sketch of this approach follows.
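A sketch of Algorithm 6A, using the standard library's heap algorithms in place of the hand-written heap (std::make_heap with std::greater builds a min-heap over the whole array in linear time; the function assumes 1 <= k <= N):

#include <vector>
#include <algorithm>
#include <functional>
#include <iostream>

// kth smallest: buildHeap over all N items, then k deleteMins.
int kthSmallest(std::vector<int> items, int k)
{
    std::make_heap(items.begin(), items.end(), std::greater<int>{});      // O(N) buildHeap
    int answer = items.front();
    for (int i = 0; i < k; ++i)
    {
        answer = items.front();                                           // current minimum
        std::pop_heap(items.begin(), items.end(), std::greater<int>{});   // one deleteMin
        items.pop_back();
    }
    return answer;
}

int main()
{
    std::cout << kthSmallest({13, 23, 11, 16, 15, 10, 26}, 4) << '\n';    // prints 15, the median
    return 0;
}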
Algorithm 6B
To find the kth largest element, maintain a set S of the k largest elements seen so far. After the first k elements are read, when a new element is read it is compared with the kth largest element, which we denote by S_k. Notice that S_k is the smallest element in S. If the new element is larger, then it replaces S_k in S. S will then have a new smallest element, which may or may not be the newly added element. At the end of the input, we find the smallest element in S and return it as the answer.
The first k elements are placed into the heap in total time O(k) with a call to buildHeap. The time to process each of the remaining elements is O(1), to test if the element goes into S, plus O(log k), to delete S_k and insert the new element if this is necessary. Thus, the total time is O(k + (N - k) log k) = O(N log k). This algorithm also gives a bound of Θ(N log N) for finding the median. A sketch follows.
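A sketch of Algorithm 6B using std::priority_queue with std::greater as the min-heap S (filling S by k individual pushes rather than a single buildHeap; the O(N log k) bound is unchanged):

#include <vector>
#include <queue>
#include <functional>
#include <iostream>

// kth largest: keep a min-heap of the k largest items seen so far;
// its top is always the current kth largest, S_k.
int kthLargest(const std::vector<int> & items, int k)
{
    std::priority_queue<int, std::vector<int>, std::greater<int>> s;
    for (int x : items)
    {
        if (static_cast<int>(s.size()) < k)
            s.push(x);             // still collecting the first k elements
        else if (x > s.top())
        {
            s.pop();               // x replaces S_k, the smallest of the k largest
            s.push(x);
        }
    }
    return s.top();                // the smallest element of S is the answer
}

int main()
{
    std::cout << kthLargest({13, 23, 11, 16, 15, 10, 26}, 4) << '\n';   // prints 15
    return 0;
}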
6.4.2 Event Simulation
Consider a bank, where customers arrive and wait in a line until one of k tellers is available. Customer arrival is governed by a probability distribution function, as is the service time. The bank officers want to determine how many tellers are needed to ensure reasonably smooth service, and we are interested in statistics such as how long on average a customer has to wait or how long the line might be. With certain probability distributions and values of k, these answers can be computed exactly. However, as k gets larger, the analysis becomes considerably more difficult, so it is appealing to use a computer to simulate the operation of the bank.
A simulation consists of processing events. The two events here are (a) a customer arriving and (b) a customer departing, thus freeing up a teller. We use the probability functions to generate an input stream consisting of ordered pairs of arrival time and service time for each customer, sorted by arrival time. At any point, the next event that can occur is either (a) the next customer in the input file arrives or (b) one of the customers at a teller leaves. Since all the times when the events will happen are available, we just need to find the event that happens nearest in the future and process that event.
If the event is a departure, processing includes gathering statistics for the departing customer and checking the line (queue) to see whether there is another customer waiting. If so, we add that customer, process whatever statistics are required, compute the time when that customer will leave, and add that departure to the set of events waiting to happen.
If the event is an arrival, we check for an available teller. If there is none, we place the arrival on the line (queue); otherwise we give the customer a teller, compute the customer's departure time, and add the departure to the set of events waiting to happen.
Since we need to find the event nearest in the future, it is appropriate that the set of departures waiting to happen be organized in a priority queue. If there are C customers (and thus 2C events) and k tellers, then the running time of the simulation would be O(C log(k + 1)), because computing and processing each event takes O(log H), where H = k + 1 is the size of the heap. A sketch of such a simulation loop follows.
6.5 d-Heaps
A d-heap is exactly like a binary heap except that all nodes have d children (thus, a binary heap is a 2-heap). A d-heap improves the running time of inserts to O(log_d N), but the time for deleteMin grows to O(d log_d N). When the priority queue is too large to fit entirely in main memory, a d-heap can be advantageous in much the same way as B-trees are.
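The index arithmetic generalizes the binary case; a small sketch, assuming the same 1-based array layout as before (the helper names are not from the slides):

// c-th child (c = 1..d) and parent of position i in a 1-indexed d-heap.
// For d == 2 these reduce to the familiar 2i, 2i + 1, and i / 2 formulas.
inline int dHeapChild(int i, int c, int d)
{
    return d * (i - 1) + c + 1;
}

inline int dHeapParent(int i, int d)
{
    return (i - 2) / d + 1;
}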
6.6 Leftist Heaps
Combining two heaps into one (merging) is a hard operation, and all the advanced data structures that support efficient merging require the use of a linked data structure. Like a binary heap, a leftist heap has both a structural property and an ordering property. The only difference between a leftist heap and a binary heap is that leftist heaps are not perfectly balanced, but actually attempt to be very unbalanced.

Leftist Heap Property
Define the null path length, npl(X), of any node X to be the length of the shortest path from X to a node without two children. Thus, the npl of a node with zero or one child is 0, while npl(nullptr) = -1. The leftist heap property is that for every node X in the heap, the null path length of the left child is at least as large as that of the right child.
Theorem: a leftist tree with r nodes on the right path must have at least 2^r - 1 nodes.
Proof: the proof is by induction. If r = 1, there must be at least one tree node. Otherwise, suppose that the theorem is true for 1, 2, . . . , r. Consider a leftist tree with r + 1 nodes on the right path. Then the root has a right subtree with r nodes on the right path, and a left subtree with at least r nodes on the right path (otherwise it would not be leftist). Applying the inductive hypothesis to these subtrees yields a minimum of 2^r - 1 nodes in each subtree. This plus the root gives at least 2^(r+1) - 1 nodes in the tree.
From this theorem, it follows immediately that a leftist tree of N nodes has a right path containing at most ⌊log(N + 1)⌋ nodes.
The general idea for the leftist heap operations is to perform all the work on the right path, which is guaranteed to be short. The only tricky part is that performing inserts and merges on the right path could destroy the leftist heap property; it turns out to be extremely easy to restore the property.

Leftist Heap Operations
The fundamental operation on leftist heaps is merging. The strategy is to recursively merge the heap with the larger root with the right subheap of the heap with the smaller root, swapping a node's children afterwards wherever the leftist property has been violated. (The slides walk through this merge over four figures that are not reproduced in this transcript; a code sketch follows.)
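A sketch of the recursive merge (the node layout, the use of raw pointers with no memory management, and the npl bookkeeping details are illustrative assumptions; only the merging strategy follows the slides):

#include <utility>
#include <iostream>

struct LeftistNode
{
    int element;
    LeftistNode *left = nullptr, *right = nullptr;
    int npl = 0;                            // null path length
};

LeftistNode * merge(LeftistNode * h1, LeftistNode * h2)
{
    if (h1 == nullptr) return h2;
    if (h2 == nullptr) return h1;
    if (h2->element < h1->element)
        std::swap(h1, h2);                  // h1 is now the heap with the smaller root

    // Recursively merge the heap with the larger root into h1's right subheap.
    h1->right = merge(h1->right, h2);

    // Restore the leftist property: npl(left) must be at least npl(right).
    int nplLeft  = h1->left  ? h1->left->npl  : -1;
    int nplRight = h1->right ? h1->right->npl : -1;
    if (nplLeft < nplRight)
        std::swap(h1->left, h1->right);
    h1->npl = (h1->right ? h1->right->npl : -1) + 1;
    return h1;
}

int main()
{
    LeftistNode a{5}, b{3}, c{7};           // three single-node heaps (C++14 aggregate init)
    LeftistNode * root = merge(merge(&a, &b), &c);
    std::cout << root->element << '\n';     // prints 3, the minimum
    return 0;
}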
6.7 Skew Heaps
A skew heap is a self-adjusting version of a leftist heap that is incredibly simple to implement. The relationship of skew heaps to leftist heaps is analogous to the relation between splay trees and AVL trees. Skew heaps are binary trees with heap order, but there is no structural constraint on these trees.